Simple Linear Regression:
Ideally, the regression equation looks like this:

Y = b0 + b1*X + e

Predicted/Fitted value:

Y_hat = b0 + b1*X

Here e is the error term (residual) in the regression equation and is calculated as:

e = Y - Y_hat
OLS (Ordinary Least Squares Regression):
The residual sum of squares is:

RSS = sum of (Y_i - Y_hat_i)^2 over all observations i

The method of estimating the coefficients by minimizing the sum of squared residuals is called Ordinary Least Squares (OLS) regression.
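As a minimal sketch of the idea, the OLS estimates for simple linear regression have a closed form, and we can compute them directly with NumPy. The data arrays below are invented for illustration:

```python
import numpy as np

# Toy data (made up for illustration): y is roughly 2*x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Closed-form OLS estimates that minimize the residual sum of squares:
#   b1 = cov(x, y) / var(x),   b0 = mean(y) - b1 * mean(x)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

y_hat = b0 + b1 * x              # fitted values
rss = np.sum((y - y_hat) ** 2)   # residual sum of squares
```

Any other choice of b0 and b1 would produce a larger RSS on this data; that is exactly what "least squares" means.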
Multiple Linear Regression:
When there are multiple predictors, the equation is simply extended to accommodate them:

Y = b0 + b1*X1 + b2*X2 + ... + bp*Xp + e
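With several predictors the closed form is most easily expressed with linear algebra, so a common sketch is to build a design matrix and let NumPy's least-squares solver do the minimization. The data below are synthetic, constructed to follow an exact linear relation:

```python
import numpy as np

# Two predictors per observation (toy values, invented for illustration)
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1]   # exact relation: b0=1, b1=2, b2=3

# Prepend a column of ones so the intercept b0 is estimated too
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
# coef holds [b0, b1, b2]
```

Because the toy response was built from an exact linear relation, the solver recovers the coefficients 1, 2, and 3.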
R-Squared:
This value represents the proportion of variation in the response explained by the model and takes a value between 0 and 1. The formula is:

R^2 = 1 - RSS/TSS

where TSS, the total sum of squares, is the sum of (Y_i - Y_bar)^2 over all observations.
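The formula translates directly into code. Reusing the toy simple-regression data from earlier (invented values, so the exact R^2 is just illustrative):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Fit the simple regression by OLS
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

rss = np.sum((y - y_hat) ** 2)     # variation left unexplained
tss = np.sum((y - y.mean()) ** 2)  # total variation around the mean
r_squared = 1 - rss / tss
```

Here the data are nearly a straight line, so R^2 comes out close to 1; a model no better than predicting the mean would give R^2 near 0.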
t-statistic & p-value:
The t-statistic of a coefficient is the estimated coefficient divided by its standard error, and it has an inverse relation with the p-value. The p-value measures the probability of seeing a coefficient this large by chance if the true coefficient were zero. So the higher the t-statistic, and hence the lower the p-value, the more significant the predictor.
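A sketch of the t-statistic for the slope of a simple regression, again on the toy data from above. Converting the t-statistic into a p-value requires the t-distribution (for example `scipy.stats.t.sf`); that step is omitted here to keep the example NumPy-only:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
residuals = y - (b0 + b1 * x)

# Residual standard error: n - 2 degrees of freedom, since two
# coefficients (b0 and b1) were estimated from the data
s = np.sqrt(np.sum(residuals ** 2) / (n - 2))
se_b1 = s / np.sqrt(np.sum((x - x.mean()) ** 2))

t_stat = b1 / se_b1   # large |t| => slope is far from 0 relative to its noise
```

On this near-perfectly-linear toy data the t-statistic is very large, so the corresponding p-value would be tiny and the slope clearly significant.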
Nonlinear Regression:
It's when the relation between the response (the predicted value) and the predictor is not linear. We can introduce a higher-order polynomial term into the regression equation, e.g. Y = b0 + b1*X + b2*X^2 + e, to model the nonlinearity.
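Polynomial terms are still fit by ordinary least squares; `np.polyfit` does exactly this. The data below are synthetic, built from a genuinely quadratic relation:

```python
import numpy as np

x = np.linspace(-3, 3, 50)
y = 1.0 + 0.5 * x + 2.0 * x ** 2   # a truly quadratic relation (no noise)

# Fit Y = b0 + b1*X + b2*X^2 by least squares
coeffs = np.polyfit(x, y, deg=2)   # returned highest degree first: [b2, b1, b0]
y_hat = np.polyval(coeffs, x)
```

Since the target is exactly quadratic, the fit recovers b2 = 2.0, b1 = 0.5, b0 = 1.0; a straight-line fit to the same data would miss the curvature entirely.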
Sometimes introducing a higher-order term makes the fitted curve too wiggly. In that case we specify knots, points where the curve is allowed to change shape, and fit piecewise polynomials called "splines". The model is then called a spline regression.
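One way to sketch spline regression with plain NumPy is the truncated power basis: a cubic polynomial plus one extra term (x - knot)^3, zeroed out below the knot, for each chosen knot, all fitted jointly by least squares. The knot positions and the sine target below are arbitrary choices for illustration:

```python
import numpy as np

x = np.linspace(0, 10, 200)
y = np.sin(x)                  # a wiggly target to approximate
knots = [2.5, 5.0, 7.5]        # assumed knot positions, picked by hand

# Design matrix: 1, x, x^2, x^3, then (x - k)^3 clipped at 0 for each knot.
# Each clipped term lets the cubic change shape past its knot while the
# curve stays smooth there.
cols = [np.ones_like(x), x, x ** 2, x ** 3]
cols += [np.clip(x - k, 0, None) ** 3 for k in knots]
B = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(B, y, rcond=None)  # fit all terms by OLS
y_hat = B @ coef
```

With only three knots the piecewise cubic tracks the sine wave closely, which a single global cubic cannot do over this range.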
Sometimes even that becomes challenging, as we may not know where to place the knots. In such cases we can use GAMs (Generalized Additive Models), which fit smooth functions of the predictors and help capture the relation without hand-picking knots. The mathematics behind this is beyond the scope of this article.