In matrix theory, an element λ of the underlying ring R of a square matrix A is called a right eigenvalue if there exists a nonzero column vector x such that Ax = λx, or a left eigenvalue if there exists a nonzero row vector y such that yA = yλ. If R is commutative, the left eigenvalues of A are exactly the right eigenvalues of A and are simply called eigenvalues. If R is not commutative, e.g. the quaternions, they may differ.
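Over a commutative ring such as the real numbers, a left eigenvalue is also a right eigenvalue (with a possibly different eigenvector). A minimal pure-Python sketch checking both conditions for one λ; the matrix A and the vectors x and y below are illustrative choices, not from the text:

```python
# Check a right eigenvalue (Ax = λx, x a column vector) and a left
# eigenvalue (yA = yλ, y a row vector) for the same λ over the reals.
# A, x, y, and λ are illustrative choices.

A = [[3, 1],
     [0, 2]]

def mat_vec(M, v):
    # M @ v, treating v as a column vector
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def vec_mat(v, M):
    # v @ M, treating v as a row vector
    return [sum(v[i] * M[i][j] for i in range(len(v))) for j in range(len(M[0]))]

lam = 3
x = [1, 0]   # right eigenvector: Ax = 3x
y = [1, 1]   # left eigenvector:  yA = 3y

assert mat_vec(A, x) == [lam * xi for xi in x]
assert vec_mat(y, A) == [lam * yi for yi in y]
print("3 is both a left and a right eigenvalue of A")
```

Note that the left and right eigenvectors for λ = 3 differ even though the eigenvalue is the same, as expected over a commutative ring.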
Multiplicity
Suppose A is a square matrix over a commutative ring. The algebraic multiplicity (or simply multiplicity) of an eigenvalue λ of A is the number of times the factor t − λ occurs in the characteristic polynomial of A. The geometric multiplicity of λ is the nullity of λI − A, that is, the dimension of the eigenspace associated with λ.
An eigenvalue of algebraic multiplicity 1 is called a simple eigenvalue.
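The two multiplicities can differ. A small sketch for the 2×2 case, where the characteristic polynomial is t² − (tr A)t + det A; the matrix below is an illustrative choice:

```python
# For A = [[2, 1], [0, 2]] the characteristic polynomial is
# t^2 - 4t + 4 = (t - 2)^2, so λ = 2 has algebraic multiplicity 2,
# but its eigenspace is only one-dimensional.

A = [[2, 1],
     [0, 2]]
lam = 2

# Characteristic polynomial of a 2x2 matrix: t^2 - (trace)t + det.
trace = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert (trace, det) == (2 * lam, lam * lam)   # i.e. (t - 2)^2

# Geometric multiplicity: nullity of (λI - A).
B = [[lam - A[0][0], -A[0][1]],
     [-A[1][0], lam - A[1][1]]]               # λI - A = [[0, -1], [0, 0]]

def rank2x2(M):
    if all(e == 0 for row in M for e in row):
        return 0
    d = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return 2 if d != 0 else 1

nullity = 2 - rank2x2(B)
assert nullity == 1   # geometric multiplicity 1 < algebraic multiplicity 2
print("algebraic multiplicity 2, geometric multiplicity", nullity)
```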
Spectrum
In functional analysis, the spectrum of a linear operator A is the set of scalars ν such that νI − A is not invertible. If the underlying Hilbert space is finite-dimensional, then the spectrum of A is the same as the set of eigenvalues of A.
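In the finite-dimensional case, νI − A fails to be invertible exactly when det(νI − A) = 0, which gives a direct membership test. A minimal sketch for 2×2 matrices; the matrix A below is an illustrative choice:

```python
# ν is in the spectrum of A iff νI - A is not invertible,
# i.e. iff det(νI - A) = 0 in the finite-dimensional case.

A = [[2, 0],
     [1, 5]]   # triangular, so its eigenvalues are the diagonal entries 2 and 5

def in_spectrum(nu, M):
    # Test whether det(νI - M) = 0 for a 2x2 matrix M.
    a, b = nu - M[0][0], -M[0][1]
    c, d = -M[1][0], nu - M[1][1]
    return a * d - b * c == 0

assert in_spectrum(2, A) and in_spectrum(5, A)
assert not in_spectrum(3, A)
print("spectrum of A is {2, 5}")
```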
Multiset of eigenvalues
Occasionally, in an article on matrix theory, one may read a statement like:

	the eigenvalues of a matrix A are 4, 4, 3, 3, 3, 2, 2, 1.

It means the algebraic multiplicity of 4 is two, of 3 is three, of 2 is two, and of 1 is one. This style is used because algebraic multiplicity is the key to many mathematical proofs in matrix theory.

Trace and Determinant
Suppose the eigenvalues of a matrix A, counted with algebraic multiplicity, are λ1, λ2, ..., λn. Then the trace of A is λ1 + λ2 + ... + λn and the determinant of A is λ1λ2...λn. These two are very important concepts in matrix theory.

See also
Please refer to eigenvector for some other properties of eigenvalues.
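As a numerical check of the trace and determinant identities in the Trace and Determinant section above: in the 2×2 case, the eigenvalues are the roots of t² − (tr A)t + det A, so their sum and product can be compared directly. The matrix below is an illustrative choice:

```python
import math

# For a 2x2 matrix, the eigenvalues are the roots of
# t^2 - (trace)t + det, so their sum is the trace and
# their product is the determinant.

A = [[4, 1],
     [2, 3]]

trace = A[0][0] + A[1][1]                      # 7
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]    # 10

disc = trace * trace - 4 * det                 # 49 - 40 = 9
l1 = (trace + math.sqrt(disc)) / 2             # 5.0
l2 = (trace - math.sqrt(disc)) / 2             # 2.0

assert l1 + l2 == trace                        # trace = λ1 + λ2
assert l1 * l2 == det                          # det   = λ1 · λ2
print("eigenvalues:", l1, l2)
```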