Fundamentals of Statistics contains material from various lectures and courses given by H. Lohninger on statistics and data analysis.

Multiple Linear Regression - Introduction

Multiple linear regression (MLR) is similar to simple linear regression, the only difference being the use of more than one input variable. To model the relationship between n input variables x1, ..., xn and the target variable y, we can use the linear equation

y = a0 + a1x1 + a2x2 + ... + anxn + ε


The parameter ε denotes the error, or residual, which has a mean of zero. This equation defines a hyperplane in n-dimensional space. The parameters of this hyperplane have to be adjusted so that it optimally fits the data. To obtain the best fit, the parameters a0 to an are chosen such that the sum of the squared errors is minimized. The assumptions are the same as for simple regression, and the estimated parameters can again be assessed by using the ANOVA table.
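As a minimal sketch of this least-squares fit, the following Python example (using NumPy, with made-up data and coefficient values chosen purely for illustration) estimates the parameters a0 to an by minimizing the sum of squared errors:

```python
import numpy as np

# Hypothetical example: two input variables x1, x2 and a target y.
rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 2))           # columns hold x1 and x2
true_a = np.array([1.5, -2.0, 0.7])   # assumed a0, a1, a2 for the demo
y = true_a[0] + X @ true_a[1:] + rng.normal(scale=0.1, size=n)  # add error ε

# Prepend a column of ones so the intercept a0 is estimated as well.
X1 = np.column_stack([np.ones(n), X])

# Least-squares solution: minimizes the sum of squared residuals.
a_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)

residuals = y - X1 @ a_hat
print(a_hat)             # estimated a0, a1, a2
print(residuals.mean())  # mean residual, close to zero
```

The fitted coefficients a_hat define the hyperplane that best fits the data in the least-squares sense; the residuals average out to approximately zero, consistent with the assumption that ε has zero mean.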