What are the steps involved in the gradient descent algorithm?

Gradient descent is an optimization algorithm used to find the values of a model's parameters (weights) that minimize a cost function.
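In symbols, each iteration performs the following update (a generic sketch; the notation is not from the steps below: θ stands for the weights, J(θ) for the cost, and α for the learning rate):

$$\theta \leftarrow \theta - \alpha \, \nabla_{\theta} J(\theta)$$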

Step 1: Initialize the weights (x, y) with random values and calculate the error, i.e., the sum of squared errors (SSE)

Step 2: Calculate the gradient, i.e., the change in SSE when the weights (x, y) are changed by a very small amount. This tells us the direction in which to move x and y so that SSE decreases

Step 3: Adjust the weights in the direction opposite to the gradient, scaled by a small learning rate, to move toward the optimal values where SSE is minimized

Step 4: Use the new weights for prediction and to calculate the new SSE

Step 5: Repeat Steps 2 through 4 until further adjustments to the weights no longer significantly reduce the error (see the sketch below)
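The following is a minimal Python sketch of these steps, assuming a simple linear model y ≈ w·x + b fit to synthetic data. The names (w, b, learning_rate, tolerance) and their values are illustrative assumptions, not part of the original steps:

```python
import numpy as np

# Synthetic data for illustration only (assumed, not from the original).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 3.0 * x + 2.0 + rng.normal(0, 1, size=50)

def sse(w, b):
    """Sum of squared errors of the current fit."""
    residuals = y - (w * x + b)
    return np.sum(residuals ** 2)

# Step 1: initialize the weights randomly and compute the initial SSE.
w, b = rng.normal(size=2)
learning_rate = 1e-4   # small step size (assumed value)
tolerance = 1e-9       # stop once the SSE improvement is negligible
error = sse(w, b)

for _ in range(10_000):
    # Step 2: gradient of SSE with respect to each weight.
    residuals = y - (w * x + b)
    grad_w = -2.0 * np.sum(residuals * x)
    grad_b = -2.0 * np.sum(residuals)

    # Step 3: adjust the weights against the gradient.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

    # Step 4: recompute SSE with the new weights.
    new_error = sse(w, b)

    # Step 5: stop when further adjustments no longer reduce the error.
    if abs(error - new_error) < tolerance:
        break
    error = new_error

print(f"w = {w:.3f}, b = {b:.3f}, SSE = {error:.3f}")
```

Run on the synthetic data above, the loop recovers weights close to the true values (w near 3, b near 2). The learning rate matters: too large and the updates overshoot and diverge; too small and convergence is needlessly slow.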