Fundamentals of Statistics contains material of various lectures and courses of H. Lohninger on statistics and data analysis.

Ridge Regression

As already noted, one of the major problems of MLR is that the estimated regression coefficients become increasingly unstable as the correlation among the explanatory variables grows (see multi-collinearity).

A possible remedy can be found in "ridge regression", which stabilizes the regression coefficients by a simple mathematical trick: a small constant λ is added to the diagonal of the X'X matrix before it is inverted. However, this stabilization comes at the price of biased estimates of the coefficients. Furthermore, there is no single value of λ that works for all kinds of problems. Thus the ridge regression model has to be adapted from case to case by adjusting the model parameter λ.
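The trick described above can be sketched in a few lines of NumPy. The function below computes the ridge estimate b = (X'X + λI)⁻¹X'y; the data set with two nearly identical predictors is a made-up illustration (not from the text), assuming X and y are already centered.

```python
import numpy as np

def ridge_coefficients(X, y, lam):
    # Ridge estimate b = (X'X + lam*I)^(-1) X'y.
    # Adding lam to the diagonal of X'X keeps the matrix well
    # conditioned even when the columns of X are highly correlated.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Illustrative (hypothetical) data: two almost collinear predictors.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 1e-3 * rng.normal(size=200)   # nearly a copy of x1
X = np.column_stack([x1, x2])
y = x1 + 0.1 * rng.normal(size=200)

b_ols = ridge_coefficients(X, y, 0.0)    # ordinary least squares: unstable
b_ridge = ridge_coefficients(X, y, 0.01) # small lam shrinks and stabilizes
```

Because each component of the solution is divided by (eigenvalue + λ) instead of the bare eigenvalue, the ridge coefficient vector is always shorter than the OLS one; the shrinkage is what buys the stability, and also what introduces the bias.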

In order to find the best-suited regression model, the ridge traces (i.e. the coefficients of the regression model and the goodness of fit) are plotted against the model parameter λ. While in theory the parameter λ may vary between 0.0 and 1.0, for practical applications it is usually sufficient to scan the range between 0.0 and 0.1. The stabilization of the regression coefficients usually occurs at very low levels of λ (typically around λ ≈ 0.01). The selection of λ is a compromise between a maximum goodness of fit and stable regression coefficients.
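A ridge-trace scan like the one described can be sketched as follows. The helper `ridge_trace` is a hypothetical name, not from the text; it records the coefficients and the goodness of fit (R²) over a grid of λ values in the practical range 0.0 to 0.1, assuming centered data. In practice one would plot both curves against λ and pick the smallest λ at which the coefficient traces flatten out.

```python
import numpy as np

def ridge_trace(X, y, lambdas):
    # For each lambda, compute the ridge coefficients and R^2.
    # Assumes X and y are centered, so R^2 = 1 - SSE / SST.
    p = X.shape[1]
    coefs, r2 = [], []
    for lam in lambdas:
        b = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
        resid = y - X @ b
        coefs.append(b)
        r2.append(1.0 - (resid @ resid) / (y @ y))
    return np.array(coefs), np.array(r2)

# Illustrative (hypothetical) collinear data, as in the sketch above.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 1e-3 * rng.normal(size=200)
X = np.column_stack([x1, x2])
y = x1 + 0.1 * rng.normal(size=200)
y = y - y.mean()

lambdas = np.linspace(0.0, 0.1, 101)     # practical scan range
coefs, r2 = ridge_trace(X, y, lambdas)   # curves to plot against lambda
```

Note the trade-off the plot makes visible: R² can only decrease as λ grows (λ = 0 minimizes the residual sum of squares by construction), while the coefficients shrink toward stable values.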