Fundamentals of Statistics contains material from various lectures and courses by H. Lohninger on statistics, data analysis, and chemometrics.
See also: PCA, matrix algebra, matrix inversion, eigenvectors
Eigenvectors and Eigenvalues  Definition
An eigenvector e and its associated eigenvalue λ of a square matrix Z are defined by the equation

Ze = λe

with e (a vector) being the eigenvector, and λ (a scalar) the eigenvalue. This formal definition may look somewhat abstract, and its benefits are of course not evident from the equation above. However, eigenanalysis is important in various technical and scientific fields. In the context of data analysis, eigenvectors are used to obtain a more stable set of descriptors for viewing the data.

In most cases eigenanalysis is based on some form of a scatter matrix

Z = X^{T}X,

with X being a matrix of n rows (observations) and p columns (variables). The resulting square matrix Z of dimension p by p can have at most p eigenvectors. Collecting the eigenvectors as the columns of a matrix E, this can be denoted in matrix notation as

ZE = E diag(λ_{1}, ... λ_{p})

Since Z is symmetric, the eigenvectors are orthogonal to each other, and the product E^{T}E is the identity matrix I (E is an orthonormal matrix). The matrix Z can therefore be expressed by its eigenvectors and eigenvalues:

Z = E diag(λ_{1}, ... λ_{p}) E^{T}

Note that, provided all eigenvalues are non-zero, the inverse of Z can be written simply as

Z^{-1} = E diag(1/λ_{1}, ... 1/λ_{p}) E^{T}
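The identities above can be checked numerically. The following sketch (a hedged illustration using NumPy, with randomly generated example data rather than anything from the text) builds a scatter matrix Z = XᵀX, computes its eigenvectors and eigenvalues, and verifies the defining equation, the orthonormality of E, the reconstruction of Z, and the inverse via reciprocal eigenvalues:

```python
import numpy as np

# Hypothetical example data: n = 5 observations, p = 3 variables
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))

Z = X.T @ X  # scatter matrix, symmetric p-by-p

# eigh is the appropriate routine for symmetric matrices; it returns the
# eigenvalues (ascending) and the orthonormal eigenvectors as columns of E
lam, E = np.linalg.eigh(Z)

# Ze = λe for every eigenpair
for i in range(len(lam)):
    assert np.allclose(Z @ E[:, i], lam[i] * E[:, i])

# E^T E = I (E is orthonormal)
assert np.allclose(E.T @ E, np.eye(3))

# Z = E diag(λ_1, ... λ_p) E^T
assert np.allclose(E @ np.diag(lam) @ E.T, Z)

# Z^{-1} = E diag(1/λ_1, ... 1/λ_p) E^T (requires all λ_i != 0)
assert np.allclose(E @ np.diag(1.0 / lam) @ E.T, np.linalg.inv(Z))
```

With random data of full rank all eigenvalues are positive, so the inversion formula applies; for rank-deficient X some eigenvalues would be zero and Z would not be invertible.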


Last Update: 2012-10-08