### Gradient Descent for Linear Regression

Gradient Descent is one of the most commonly used iterative optimization algorithms in machine learning, employed to train machine learning and deep learning models. It works by finding a local minimum of a differentiable function.

How gradient descent locates a local minimum or local maximum of a function can be summarized as follows:

• If we repeatedly move in the direction of the negative gradient of the function at the current point, we approach a local minimum of that function.
• If we repeatedly move in the direction of the positive gradient of the function at the current point, we approach a local maximum of that function (this variant is called gradient ascent).

This iterative process is known as gradient descent, and is also called steepest descent. The main objective of the algorithm is to minimize the cost function through repeated parameter updates.
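For linear regression, the cost function being minimized is typically the mean squared error (MSE). A minimal sketch of that cost, using NumPy (the function name `mse_cost` and the sample values are illustrative, not from the original article):

```python
import numpy as np

def mse_cost(y_true, y_pred):
    """Mean squared error between actual and predicted values."""
    return np.mean((y_true - y_pred) ** 2)

# Small illustrative example
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.9, 3.2])
cost = mse_cost(y_true, y_pred)  # average of squared residuals
```

Gradient descent repeatedly nudges the model's parameters so that this number shrinks.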

## How Gradient Descent Works

Before walking through how gradient descent works, we need some basic concepts for finding the slope of a line in linear regression. The equation for simple linear regression is:

Y = mx + c

Where ‘m’ represents the slope of the line and ‘c’ represents the intercept on the y-axis.

• At the starting point, we compute the first derivative (the slope of the cost function) at the current parameters, using the tangent line to measure how steep that slope is. This slope then determines the updates to the parameters (weights and bias).
• The slope is steepest at the arbitrary starting point; as new parameters are generated, the steepness gradually decreases, approaching zero at the lowest point, which is called the point of convergence.
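The steps above can be sketched as a short NumPy implementation that fits `m` and `c` by descending the gradient of the MSE cost (the function name, learning rate, and sample data are illustrative assumptions, not from the original article):

```python
import numpy as np

def gradient_descent(x, y, lr=0.05, epochs=1000):
    """Fit y ≈ m*x + c by gradient descent on the MSE cost."""
    m, c = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        y_pred = m * x + c
        # Partial derivatives of the MSE with respect to m and c
        dm = (-2.0 / n) * np.sum(x * (y - y_pred))
        dc = (-2.0 / n) * np.sum(y - y_pred)
        # Step in the direction of the negative gradient
        m -= lr * dm
        c -= lr * dc
    return m, c

# Noise-free data generated from y = 2x + 1
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
m, c = gradient_descent(x, y)  # should recover m ≈ 2, c ≈ 1
```

Each iteration the gradients `dm` and `dc` shrink as the parameters approach the minimum, matching the "steepness gradually reduces" behaviour described above.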

## Role of Learning Rate

The main objective of gradient descent is to minimize the cost function, i.e. the error between the expected and actual values. The learning rate plays a crucial role in this minimization.

The learning rate is the step size taken toward the minimum at each iteration. It is typically a small value, tuned based on the behavior of the cost function.

If the learning rate is high, the algorithm takes large steps but risks overshooting the minimum. Conversely, a low learning rate takes small steps, which sacrifices efficiency but gains precision.
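This trade-off can be seen on the simplest possible cost function, f(w) = w², whose gradient is 2w (the function name `descend` and the specific learning-rate values are illustrative assumptions):

```python
def descend(lr, steps=20):
    """Minimise f(w) = w**2 (gradient 2*w) starting from w = 1."""
    w = 1.0
    for _ in range(steps):
        w -= lr * 2 * w  # gradient descent update
    return w

small = descend(0.01)  # tiny steps: after 20 steps, still far from the minimum at 0
good = descend(0.4)    # converges to the minimum quickly
big = descend(1.1)     # too large: each step overshoots and |w| grows, diverging
```

With `lr=0.01` each step only shrinks `w` by 2%, while with `lr=1.1` the update flips the sign and increases the magnitude every iteration, which is exactly the overshooting behaviour described above.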
