Fundamentals of Statistics contains material of various lectures and courses of H. Lohninger on statistics, data analysis and chemometrics.
See also: general approach, assumptions, Residuals, Coefficient of Variation, Regression - Confidence Interval, Regression - Introduction

Regression - Straight Line
For a particular value X_{i} of the independent variable, we can find the predicted value Ŷ_{i} by using the equation of a straight line: Ŷ_{i} = a + bX_{i}. The difference between Ŷ_{i} and Y_{i} is called the residual ε_{i} and represents the error made when predicting Y_{i} as the response to X_{i}. The best fit for all available points (X_{i}, Y_{i}) can be found by minimizing the sum of all squared residuals ε_{i}.
In fact, the criterion for minimizing the error could be any other suitable function; however, the sum of squares has certain mathematical advantages. A more detailed discussion of the mathematics behind regression can be found elsewhere.
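Minimizing the sum of squared residuals leads to well-known closed-form expressions for the slope b and intercept a. The following sketch illustrates this least-squares fit; the function name and the sample data are assumptions made for this example only:

```python
# Ordinary least-squares fit of a straight line Y = a + b*X.
# Illustrative sketch; fit_line and the sample data are hypothetical.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope b minimizes the sum of squared residuals:
    # b = sum((x - x̄)(y - ȳ)) / sum((x - x̄)²)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    # Intercept a follows from the fact that the fitted line
    # passes through the point of means (x̄, ȳ).
    a = mean_y - b * mean_x
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = fit_line(xs, ys)
# Residuals ε_i = Y_i - Ŷ_i; with an intercept term they sum to zero.
residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
```

Note that the residuals of a least-squares line with an intercept always sum to zero, which is one of the mathematical conveniences of the squared-error criterion.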
Please note that linear regression is based on several assumptions, which have to be fulfilled when applying regression methods to the data.


Last Update: 2012-10-08