Gradient Descent

Gradient Descent for Optimization
📘

Gradient Descent:
Gradient descent is a first-order iterative optimization algorithm used to find a local minimum of a differentiable function.
It iteratively adjusts the model's parameters in the direction opposite to the gradient of the cost function, since moving against the gradient leads towards the minimum.
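As a quick illustration (the starting value and step size here are arbitrary choices, not taken from the notes above), consider \( f(w) = w^2 \), whose gradient is \( 2w \). Starting from \( w = 4 \) with a step size of \( 0.1 \), one update gives

\[ w_{new} = 4 - 0.1 \cdot (2 \cdot 4) = 3.2 \]

so the parameter moves closer to the minimum at \( w = 0 \).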

Algorithm:

  1. Initialize the weights/parameters with random values.

  2. Calculate the gradient of the cost function at the current parameter values.

  3. Update the parameters using the gradient.

    \[ w_{new} = w_{old} - \eta \cdot \frac{\partial f}{\partial w_{old}} \]

    where \( \eta \) is the learning rate, i.e., the step size for each parameter update.
  4. Repeat steps 2 and 3 iteratively until convergence to a minimum (a minimal code sketch of this loop follows the list).
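
Below is a minimal NumPy sketch of these steps. The function name gradient_descent, its parameters, and the example objective \( f(w) = (w - 3)^2 \) are illustrative assumptions rather than anything prescribed above.

```python
import numpy as np

def gradient_descent(grad_f, w_init, eta=0.1, n_iters=100, tol=1e-6):
    """Minimize a differentiable function given its gradient grad_f.

    eta is the learning rate (step size), n_iters the maximum number of
    updates, and tol the threshold below which updates count as converged.
    """
    w = np.asarray(w_init, dtype=float)          # step 1: starting parameter values
    for _ in range(n_iters):
        grad = grad_f(w)                         # step 2: gradient at the current parameters
        w_new = w - eta * grad                   # step 3: w_new = w_old - eta * gradient
        if np.linalg.norm(w_new - w) < tol:      # step 4: stop once the update is negligible
            return w_new
        w = w_new
    return w

# Example: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w_min = gradient_descent(lambda w: 2.0 * (w - 3.0), w_init=0.0)
print(w_min)  # converges towards 3.0, the minimizer of f
```

With a step size of 0.1 this example converges in a few dozen iterations; a learning rate that is too large can overshoot and diverge, while one that is too small makes convergence slow.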







End of Section