Fundamentals of Statistics contains material of various lectures and courses of H. Lohninger on statistics, data analysis and chemometrics.
See also: Regression, Curvilinear Regression, Regression Confidence Interval, Regression after Linearisation
Univariate Regression: Derivation of Equations
Let us conduct this procedure for a particular example, the quadratic model through the origin, y = ax + bx^{2}. This formula is to be estimated from a series of data points [x_{i}, y_{i}], where the x_{i} are the independent values and the y_{i} are the values to be estimated. Substituting the y_{i} values with their estimates ax_{i}+bx_{i}^{2}, we obtain the following series of data points: [x_{i}, ax_{i}+bx_{i}^{2}]. The actual y values are, however, the y_{i}. Thus the sum of squared errors S for n data points is defined by

S = (ax_{1}+bx_{1}^{2}-y_{1})^{2} + (ax_{2}+bx_{2}^{2}-y_{2})^{2} + (ax_{3}+bx_{3}^{2}-y_{3})^{2} + ... + (ax_{n}+bx_{n}^{2}-y_{n})^{2}

Now we calculate the partial derivatives of S with respect to the parameters a and b, and equate them to zero:

dS/da = 0 = 2(ax_{1}+bx_{1}^{2}-y_{1})x_{1} + 2(ax_{2}+bx_{2}^{2}-y_{2})x_{2} + 2(ax_{3}+bx_{3}^{2}-y_{3})x_{3} + ... + 2(ax_{n}+bx_{n}^{2}-y_{n})x_{n}

dS/db = 0 = 2(ax_{1}+bx_{1}^{2}-y_{1})x_{1}^{2} + 2(ax_{2}+bx_{2}^{2}-y_{2})x_{2}^{2} + 2(ax_{3}+bx_{3}^{2}-y_{3})x_{3}^{2} + ... + 2(ax_{n}+bx_{n}^{2}-y_{n})x_{n}^{2}
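The sum of squared errors and its two partial derivatives can be checked numerically. The following sketch is not part of the original text; the data values and parameter guesses are made up for illustration. It compares the analytic derivatives above with central finite differences of S:

```python
# Model: y = a*x + b*x^2. Compute the sum of squared errors S and its
# analytic partial derivatives, then verify them by finite differences.
def sse(a, b, xs, ys):
    """S = sum over i of (a*x_i + b*x_i^2 - y_i)^2"""
    return sum((a*x + b*x*x - y)**2 for x, y in zip(xs, ys))

def dS_da(a, b, xs, ys):
    """dS/da = sum of 2*(a*x_i + b*x_i^2 - y_i)*x_i"""
    return sum(2*(a*x + b*x*x - y)*x for x, y in zip(xs, ys))

def dS_db(a, b, xs, ys):
    """dS/db = sum of 2*(a*x_i + b*x_i^2 - y_i)*x_i^2"""
    return sum(2*(a*x + b*x*x - y)*x*x for x, y in zip(xs, ys))

# illustrative (made-up) data and an arbitrary parameter guess
xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.1, 4.2, 9.1, 15.9]
a, b, h = 0.5, 0.8, 1e-6

# central finite differences should agree with the analytic derivatives
fd_a = (sse(a + h, b, xs, ys) - sse(a - h, b, xs, ys)) / (2*h)
fd_b = (sse(a, b + h, xs, ys) - sse(a, b - h, xs, ys)) / (2*h)
```

Because S is quadratic in a and b, the central differences agree with the analytic derivatives up to floating-point roundoff.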
These two equations can easily be reduced by introducing sums over the individual terms (all sums running from i = 1 to n). Dividing by 2 and collecting the coefficients of a and b gives the two normal equations:

a·Σx_{i}^{2} + b·Σx_{i}^{3} = Σx_{i}y_{i}
a·Σx_{i}^{3} + b·Σx_{i}^{4} = Σx_{i}^{2}y_{i}

Now, solve one of these equations for a, substitute the resulting expression into the other equation, and solve for b; back-substitution then yields a, with the following final results:

a = (Σx_{i}y_{i}·Σx_{i}^{4} - Σx_{i}^{2}y_{i}·Σx_{i}^{3}) / (Σx_{i}^{2}·Σx_{i}^{4} - (Σx_{i}^{3})^{2})
b = (Σx_{i}^{2}y_{i}·Σx_{i}^{2} - Σx_{i}y_{i}·Σx_{i}^{3}) / (Σx_{i}^{2}·Σx_{i}^{4} - (Σx_{i}^{3})^{2})
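The closed-form solution of the two normal equations can be sketched in a few lines of code. The function name and the test data below are illustrative, not from the original text; data generated exactly from known coefficients should be recovered exactly:

```python
# Least-squares fit of the model y = a*x + b*x^2 (no intercept term),
# using the closed-form solution of the two normal equations.
def fit_quadratic_through_origin(xs, ys):
    s2 = sum(x**2 for x in xs)               # sum of x_i^2
    s3 = sum(x**3 for x in xs)               # sum of x_i^3
    s4 = sum(x**4 for x in xs)               # sum of x_i^4
    p1 = sum(x*y for x, y in zip(xs, ys))    # sum of x_i*y_i
    p2 = sum(x*x*y for x, y in zip(xs, ys))  # sum of x_i^2*y_i
    det = s2*s4 - s3*s3                      # determinant of the system
    a = (p1*s4 - p2*s3) / det
    b = (p2*s2 - p1*s3) / det
    return a, b

# data generated exactly from y = 2x + 0.5x^2
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2*x + 0.5*x*x for x in xs]
a, b = fit_quadratic_through_origin(xs, ys)  # recovers a = 2, b = 0.5
```

Note that the sums S_{2} = Σx_{i}^{2}, S_{3} = Σx_{i}^{3}, and S_{4} = Σx_{i}^{4} depend only on the x values, so they can be precomputed once if several series of y values are to be fitted against the same x values.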


Last Update: 2012-10-08