Fundamentals of Statistics contains material of various lectures and courses of H. Lohninger on statistics, data analysis and chemometrics.


Linear Equations

One of the big advantages of matrix algebra is that systems of linear equations can be written in matrix form. As a consequence, most operations that are valid for matrices are also valid for the corresponding systems of linear equations. This is quite important for multivariate statistics, because many of its methods are based on solving systems of (linear) equations.

Look at the following example of a system of linear equations:
 

3x1 + 5x2 -  x3 = 12
2x1       +  x3 =  5
 x1 - 5x2 + 3x3 =  0

These equations can be denoted in matrix form as follows:
 

[ 3   5  -1 ] [ x1 ]   [ 12 ]
[ 2   0   1 ] [ x2 ] = [  5 ]
[ 1  -5   3 ] [ x3 ]   [  0 ]

You see that the left-hand sides of the equations have been decomposed into the product of the matrix of coefficients and the vector of the unknown variables x1, x2, and x3. This equation can be written in matrix notation as

A x = s,

with A being the matrix of coefficients, x being the vector of unknowns, and s being the vector of constants on the right-hand side of the equation system.
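The decomposition above can be sketched in a few lines of plain Python. The snippet below (a minimal illustration, not part of the original text) stores A and s from the example system and checks that the matrix product A x reproduces the left-hand sides of the three equations for an arbitrary trial vector x; the helper name matvec and the trial values are assumptions for illustration only.

```python
# Coefficient matrix A and constant vector s, taken from the system above.
A = [[3,  5, -1],
     [2,  0,  1],
     [1, -5,  3]]
s = [12, 5, 0]

def matvec(M, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

# For any trial values of x1, x2, x3, the product A x yields exactly the
# left-hand sides of the three equations (the values here are arbitrary).
x = [1, 2, 3]
lhs = [3*x[0] + 5*x[1] -   x[2],
       2*x[0]           +   x[2],
         x[0] - 5*x[1] + 3*x[2]]
print(matvec(A, x) == lhs)  # True: the matrix form encodes the same equations
```

Solving the system then amounts to finding the x for which A x equals s, which is where the matrix operations discussed for matrices carry over to equation systems.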