{"id":871,"date":"2018-09-17T14:51:45","date_gmt":"2018-09-17T14:51:45","guid":{"rendered":"http:\/\/muthu.co\/?p=871"},"modified":"2021-05-24T03:01:17","modified_gmt":"2021-05-24T03:01:17","slug":"linear-regression-using-gradient-descent-algorithm","status":"publish","type":"post","link":"http:\/\/write.muthu.co\/linear-regression-using-gradient-descent-algorithm\/","title":{"rendered":"Linear Regression using Gradient Descent Algorithm"},"content":{"rendered":"\n
Gradient descent is an optimization method used to find the minimum value of a function by iteratively updating the parameters of the function. Parameters refer to coefficients in Linear Regression and weights in Neural Networks.<\/p>\n\n\n\n
In a linear regression problem<\/a>, we find a model that gives an approximate representation of our dataset. In the image below, the red dots denote our dataset, the blue line is the fit we find using linear regression, and the vertical distance between the red dots and the blue line is what we call the cost or the error. You can understand more about linear regression from my previous<\/a> post.<\/p>\n\n\n\n
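To make the idea concrete, here is a minimal sketch of gradient descent fitting a line y = m*x + b by minimizing the mean squared error. The data, learning rate, and epoch count are illustrative assumptions, not values from this post.

```python
import numpy as np

def gradient_descent(x, y, lr=0.01, epochs=1000):
    """Fit y ~ m*x + b by repeatedly stepping down the MSE gradient."""
    m, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        y_pred = m * x + b
        # Partial derivatives of MSE = (1/n) * sum((y - y_pred)^2)
        dm = (-2 / n) * np.sum(x * (y - y_pred))
        db = (-2 / n) * np.sum(y - y_pred)
        # Update the parameters in the direction of steepest descent
        m -= lr * dm
        b -= lr * db
    return m, b

# Hypothetical example data lying exactly on y = 2x
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 6, 8, 10], dtype=float)
m, b = gradient_descent(x, y)
```

With this data the loop converges close to m = 2 and b = 0; the learning rate must be small enough that the updates do not overshoot and diverge.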