Maths behind Polynomial regression
Published 2018-06-17 (last modified 2021-05-24) — http://write.muthu.co/maths-behind-polynomial-regression/
Polynomial regression is the process of finding a polynomial function of the form f(x) = c₀ + c₁x + c₂x² + ⋯ + cₙxⁿ, where n is the degree of the polynomial and c₀ … cₙ are its coefficients. Through polynomial regression we try to find the nth-degree polynomial that is the closest approximation of our data points. Below, a sample dataset has been fitted with polynomials of degree up to 3 and plotted on a graph. The blue dots represent our data points and the lines represent the fitted polynomials of different degrees. The higher the degree, the closer the approximation, but higher doesn't always mean better, which we will discuss in later articles.

y_train = [[100], [110], [124], [142], [159], [161], [170], [173], [179], [180], [217], [228], [230], [284], [300], [330], [360], [392], [414], [435], [451], [476], [499], [515], [543], [564]]
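As a minimal sketch of the fitting described above, NumPy's least-squares `polyfit` can produce the coefficients c₀ … cₙ for each degree (the original post may use a different method; the index-based `x_train` values are an assumption, since only the y values are given):

```python
import numpy as np

# y values from the dataset above, flattened; x is assumed to be the sample index 0..25
y_train = np.array([100, 110, 124, 142, 159, 161, 170, 173, 179, 180, 217, 228,
                    230, 284, 300, 330, 360, 392, 414, 435, 451, 476, 499, 515,
                    543, 564], dtype=float)
x_train = np.arange(len(y_train), dtype=float)

# Fit polynomials of degree 1, 2 and 3 by least squares
for degree in (1, 2, 3):
    coeffs = np.polyfit(x_train, y_train, degree)  # highest-order coefficient first
    poly = np.poly1d(coeffs)
    residual = np.sum((poly(x_train) - y_train) ** 2)
    print(f"degree {degree}: sum of squared residuals = {residual:.1f}")
```

Each extra degree adds a basis term xᵏ, so the sum of squared residuals can only stay the same or shrink as the degree grows, which is exactly the "closer but not necessarily better" behaviour the plot illustrates.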