Fundamentals of Statistics contains material from various lectures and courses by H. Lohninger on statistics and data analysis.

Error Propagation

What happens if a process under investigation is influenced not by a single source of random error but by several sources which all contribute to the measured signal? Mathematically speaking, this can be formulated as follows: let's assume, for example, that the measured signal y is a function of three variables a, b, and c.

y = f(a,b,c)

The resulting overall error (which is specified by the variance sy² of the signal) consists of three parts, each of which is proportional to the square of the partial derivative and to the variance of the particular variable. Thus the contributions to the total error of the signal y (assuming that y is a linear function of the independent variables a, b, and c) add up according to the following identity:

sy² = (∂y/∂a)²·sa² + (∂y/∂b)²·sb² + (∂y/∂c)²·sc²
In general, the variance sy² of a combined signal is equal to the sum of the variances of the individual contributions, each weighted by the square of the partial derivative of y with respect to the corresponding variable.
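This rule can be checked numerically. The sketch below (a hypothetical example, not from the text) propagates the variances of three independent inputs through a linear function y = a + 2b − c, whose partial derivatives are 1, 2, and −1, and compares the analytic result with a simple Monte Carlo simulation; all input variances and the function itself are assumptions chosen for illustration.

```python
import math
import random

def propagated_variance(partials, variances):
    """Gaussian error propagation for independent inputs:
    s_y^2 = sum over i of (dy/dx_i)^2 * s_i^2."""
    return sum(d * d * v for d, v in zip(partials, variances))

# Hypothetical example: y = a + 2*b - c, so dy/da = 1, dy/db = 2, dy/dc = -1.
s2_a, s2_b, s2_c = 0.04, 0.01, 0.09   # assumed variances of a, b, c

# Analytic propagation: 1*0.04 + 4*0.01 + 1*0.09 ≈ 0.17
s2_y = propagated_variance([1.0, 2.0, -1.0], [s2_a, s2_b, s2_c])
print("propagated variance:", s2_y)

# Monte Carlo check: draw many noisy measurements around the
# (arbitrary) true values a=1, b=2, c=3 and estimate var(y) directly.
random.seed(0)
n = 200_000
samples = [
    (1.0 + random.gauss(0, math.sqrt(s2_a)))
    + 2.0 * (2.0 + random.gauss(0, math.sqrt(s2_b)))
    - (3.0 + random.gauss(0, math.sqrt(s2_c)))
    for _ in range(n)
]
mean = sum(samples) / n
mc_var = sum((x - mean) ** 2 for x in samples) / (n - 1)
print("Monte Carlo variance:", mc_var)   # close to the analytic value
```

Because y is linear in a, b, and c, the two estimates agree up to sampling noise; for a nonlinear f the propagation formula is only a first-order approximation.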

In practical applications the law of error propagation is subject to considerable restrictions: to be useful, it requires that the error amplitudes of the individual signal contributions are known, a condition that is rarely fulfilled. In that case the total error cannot be estimated in this way. Furthermore, experience shows that the estimates of the individual variances are quite often wrong, which consequently leads to a wrong estimate of the total error.