

Eigenvectors and Eigenvalues - Definition

Eigenvectors and eigenvalues are defined only for square matrices. For a square matrix Z they satisfy the following equation:

$Z\mathbf{e} = \mathbf{e}\lambda$

where e (a vector) is an eigenvector and λ (a scalar) the corresponding eigenvalue.
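
This defining equation is easy to check numerically. The following is a minimal sketch, assuming Python with NumPy (neither is prescribed by the text, and the matrix values are arbitrary examples):

import numpy as np

# A small symmetric square matrix Z (example values chosen arbitrarily)
Z = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors (as columns)
eigenvalues, eigenvectors = np.linalg.eig(Z)

# Verify the defining equation Z e = e * lambda for the first pair
e = eigenvectors[:, 0]
lam = eigenvalues[0]
print(np.allclose(Z @ e, e * lam))   # True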

This formal definition may look somewhat abstract, and its benefits are certainly not evident from the equation above. Nevertheless, eigenanalysis is important in many technical and scientific fields. In the context of data analysis, eigenvectors are used to obtain a more stable set of descriptors for viewing the data. In most cases the eigenanalysis is based on some form of scatter matrix $Z = X^{T}X$, with X being a matrix of n rows (observations) and p columns (variables).
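
A sketch of how such a scatter matrix might be formed and analyzed, again assuming NumPy (the random data below is purely illustrative):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))        # n = 100 observations, p = 3 variables

Z = X.T @ X                          # scatter matrix Z = X^T X (p x p, symmetric)

# eigh is suited to symmetric matrices; eigenvalues come back in ascending order
eigenvalues, E = np.linalg.eigh(Z)
print(eigenvalues)                   # all non-negative, since Z = X^T X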

A square matrix of dimension p by p has at most p linearly independent eigenvectors. Collecting the eigenvectors as the columns of a matrix E, this can be written in matrix notation as

$ZE = E\,\mathrm{diag}(\lambda_1, \ldots, \lambda_p)$

For a symmetric matrix Z (such as a scatter matrix) the eigenvectors are orthogonal to each other, and the product $E^{T}E$ is the identity matrix I (E is an orthogonal matrix). The matrix Z can then be expressed in terms of its eigenvectors and eigenvalues:

$Z = E\,\mathrm{diag}(\lambda_1, \ldots, \lambda_p)\,E^{T}$
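
All three relations can be verified numerically. A sketch assuming NumPy and an arbitrary symmetric example matrix:

import numpy as np

Z = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])      # symmetric, so the eigenvectors are orthogonal

eigenvalues, E = np.linalg.eigh(Z)   # columns of E are the eigenvectors
Lam = np.diag(eigenvalues)           # diag(lambda_1, ..., lambda_p)

print(np.allclose(Z @ E, E @ Lam))       # Z E = E diag(lambda_1, ..., lambda_p)
print(np.allclose(E.T @ E, np.eye(3)))   # E^T E = I  (E is orthogonal)
print(np.allclose(E @ Lam @ E.T, Z))     # Z = E diag(lambda_1, ..., lambda_p) E^T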

Note that, provided all eigenvalues are nonzero (i.e. Z is invertible), the inverse matrix of Z can be written simply as

$Z^{-1} = E\,\mathrm{diag}(1/\lambda_1, \ldots, 1/\lambda_p)\,E^{T}$
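
A short NumPy sketch of this inversion, compared against a direct inverse (the matrix is again an arbitrary example):

import numpy as np

Z = np.array([[4.0, 1.0],
              [1.0, 3.0]])

eigenvalues, E = np.linalg.eigh(Z)

# Invert by taking the reciprocals of the eigenvalues (valid only if none is zero)
Z_inv = E @ np.diag(1.0 / eigenvalues) @ E.T
print(np.allclose(Z_inv, np.linalg.inv(Z)))   # True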