Fundamentals of Statistics contains material from various lectures and courses of H. Lohninger on statistics and data analysis.

Regression - Straight Line

For a particular value Xi of the independent variable, we can find the predicted value Ŷi by using the equation of a straight line:

Ŷi = a + bXi

The difference between Ŷi and Yi is called the residual εi and represents the error made when predicting Yi as the response to the variable X. The best fit for all available points Xi can be found by minimizing the sum of all squared residuals εi. In principle, the criterion for minimizing the error could be any other suitable function; however, the sum of squares has certain mathematical advantages. A more detailed discussion of the mathematics behind regression can be found elsewhere.
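The least-squares estimates of a and b have closed-form solutions, which can be sketched in a few lines of Python. The data values below are hypothetical examples, not taken from the text:

```python
# Least-squares fit of a straight line Yhat = a + b*X.
# xs, ys are hypothetical example data.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# The slope b minimizes the sum of squared residuals;
# the intercept a follows because the fitted line always
# passes through the point (mean_x, mean_y).
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

# Residuals: the prediction errors epsilon_i = Yi - (a + b*Xi).
residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
```

Note that the residuals of a least-squares fit always sum to (numerically) zero, which is a direct consequence of the minimization.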

Please note that linear regression is based on several assumptions, which have to be fulfilled when applying regression methods to the data.