12.3.3. The Schur Decomposition of Symmetric Matrices
The Schur decomposition and eigendecomposition of a real, symmetric matrix have some unique properties. For this reason, researchers working on algorithms related to the eigenvalue problem often use symmetric matrices for the initial testing of their algorithms. An algorithm should first succeed at finding the Schur decomposition of symmetric matrices before it can be expected to operate successfully on non-symmetric matrices.
12.3.3.1. Real Eigenvalues of a Symmetric Matrix
Theorem 12.2 (Real Eigenvalues of a Symmetric Matrix)

Let $\mathbf{A}$ be a real, square, and symmetric matrix, $\mathbf{A} \in \mathbb{R}^{n \times n}$, $\mathbf{A} = \mathbf{A}^T$, then all eigenvalues and eigenvectors of $\mathbf{A}$ are real.
Proof. Our proof allows that an eigenvalue $\lambda$ and its eigenvector $\mathbf{x}$ of a symmetric matrix $\mathbf{A}$ might be complex, with complex conjugates $\bar{\lambda}$ and $\bar{\mathbf{x}}$, and then shows that the eigenvector equation is only satisfied with real eigenvalues. The eigenvectors are real when the eigenvalues are real.
We start with an eigenvalue equation and the complex conjugate of the same eigenvalue equation. We then pre-multiply the first equation by the eigenvector's conjugate transpose and the second equation by the transpose of the eigenvector. Finally, we subtract to see that the eigenvalues must be real.

$$\mathbf{A}\,\mathbf{x} = \lambda\,\mathbf{x}, \qquad \mathbf{A}\,\bar{\mathbf{x}} = \bar{\lambda}\,\bar{\mathbf{x}}$$

$$\bar{\mathbf{x}}^T\mathbf{A}\,\mathbf{x} = \lambda\,\bar{\mathbf{x}}^T\mathbf{x}, \qquad \mathbf{x}^T\mathbf{A}\,\bar{\mathbf{x}} = \bar{\lambda}\,\mathbf{x}^T\bar{\mathbf{x}}$$

$$\bar{\mathbf{x}}^T\mathbf{A}\,\mathbf{x} - \mathbf{x}^T\mathbf{A}\,\bar{\mathbf{x}} = \left(\lambda - \bar{\lambda}\right)\bar{\mathbf{x}}^T\mathbf{x}$$
Because of the symmetry of $\mathbf{A}$, the scalar values on the left-hand side are the same (subtracting to zero): $\mathbf{x}^T\mathbf{A}\,\bar{\mathbf{x}} = \left(\mathbf{x}^T\mathbf{A}\,\bar{\mathbf{x}}\right)^T = \bar{\mathbf{x}}^T\mathbf{A}^T\mathbf{x} = \bar{\mathbf{x}}^T\mathbf{A}\,\mathbf{x}$. On the right-hand side, the dot product $\bar{\mathbf{x}}^T\mathbf{x}$ is the sum of the squared magnitudes of the eigenvector's elements and cannot be zero for a nonzero vector. Thus, it must be that $\lambda - \bar{\lambda} = 0$, which is only true when $\lambda$ is real.
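A quick numerical check of Theorem 12.2 may also be helpful. The following is a minimal sketch assuming NumPy is available; the random test matrix is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Build a random symmetric matrix: B + B^T is always symmetric.
B = rng.standard_normal((5, 5))
A = B + B.T

# np.linalg.eig makes no symmetry assumption, so any complex parts
# would show up here. For symmetric A they are (numerically) zero.
eigvals, eigvecs = np.linalg.eig(A)
print(np.max(np.abs(eigvals.imag)))   # 0.0 -- all eigenvalues are real
print(np.max(np.abs(eigvecs.imag)))   # 0.0 -- the eigenvectors are real too
```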
12.3.3.2. Spectral Decomposition Theorem
We anticipate orthogonal eigenvectors of a symmetric matrix because the symmetry between the rows and columns of $\mathbf{A}$ causes the matrix to remain symmetric as the Schur decomposition progresses. Because of the symmetry, the Schur decomposition yields the triangular factor $\mathbf{T}$ as a diagonal matrix. So, the Schur decomposition of a symmetric matrix is the eigendecomposition, and the Schur vectors are also the eigenvectors. However, a more formal proof is available.
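This claim is easy to observe numerically. Here is a minimal sketch assuming SciPy is available, where scipy.linalg.schur returns the factors of $\mathbf{A} = \mathbf{Z}\,\mathbf{T}\,\mathbf{Z}^T$; the matrix names are illustrative only.

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(7)
B = rng.standard_normal((4, 4))
A = B + B.T                        # symmetric test matrix

T, Z = schur(A)                    # A = Z @ T @ Z.T

# For symmetric A, the Schur form T is diagonal, so the Schur
# vectors in Z are eigenvectors and diag(T) holds the eigenvalues.
print(np.max(np.abs(T - np.diag(np.diag(T)))))       # ~1e-15: T is diagonal
print(np.allclose(A @ Z, Z @ np.diag(np.diag(T))))   # True: A z_i = t_ii z_i
```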
Theorem 12.3 (Spectral Decomposition Theorem)

Let $\mathbf{A}$ be a real, square, and symmetric matrix, $\mathbf{A} \in \mathbb{R}^{n \times n}$, $\mathbf{A} = \mathbf{A}^T$, then the eigenvectors are orthogonal to each other.
Proof. Recall that the commutative property of dot products allows the vectors to be reversed. Then, because of the symmetry of matrix $\mathbf{A}$, we have the following equality relationship between any two eigenvectors and the symmetric matrix.

$$\mathbf{x}_1^T\left(\mathbf{A}\,\mathbf{x}_2\right) = \left(\mathbf{A}\,\mathbf{x}_2\right)^T\mathbf{x}_1 = \mathbf{x}_2^T\mathbf{A}^T\mathbf{x}_1 = \mathbf{x}_2^T\mathbf{A}\,\mathbf{x}_1$$
Starting with two eigenvector equations, we can pre-multiply the first equation by the transpose of the eigenvector from the second equation, then pre-multiply the second equation by the transpose of the eigenvector from the first equation and subtract the two equations.

$$\mathbf{A}\,\mathbf{x}_1 = \lambda_1\,\mathbf{x}_1, \qquad \mathbf{A}\,\mathbf{x}_2 = \lambda_2\,\mathbf{x}_2$$

$$\mathbf{x}_2^T\mathbf{A}\,\mathbf{x}_1 = \lambda_1\,\mathbf{x}_2^T\mathbf{x}_1, \qquad \mathbf{x}_1^T\mathbf{A}\,\mathbf{x}_2 = \lambda_2\,\mathbf{x}_1^T\mathbf{x}_2$$

$$0 = \left(\lambda_1 - \lambda_2\right)\mathbf{x}_2^T\mathbf{x}_1$$

The left-hand sides cancel by the equality above, and $\mathbf{x}_1^T\mathbf{x}_2 = \mathbf{x}_2^T\mathbf{x}_1$. For distinct eigenvalues, $\lambda_1 - \lambda_2 \neq 0$, so it must be that $\mathbf{x}_2^T\mathbf{x}_1 = 0$.
Because the dot product between any two eigenvectors ($\mathbf{x}_2^T\mathbf{x}_1 = 0$) of a symmetric matrix is zero, the eigenvectors must be orthogonal.
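The pairwise orthogonality can be seen concretely with a short NumPy sketch (not part of the proof; the random matrix almost surely has distinct eigenvalues).

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4))
A = B + B.T                      # symmetric test matrix

eigvals, X = np.linalg.eig(A)    # columns of X are eigenvectors

# Eigenvectors belonging to distinct eigenvalues are orthogonal.
print(abs(X[:, 0] @ X[:, 1]))    # ~1e-16
```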
Owing to the use of unitary matrices, the eigenvectors will be unit length, so the eigenvector matrix, $\mathbf{X}$, has orthonormal columns, and it is orthogonal ($\mathbf{X}^T\mathbf{X} = \mathbf{I}$) because it is square and real.
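As a final numerical confirmation, here is a minimal sketch using NumPy's symmetric eigensolver, np.linalg.eigh, which returns unit-length, mutually orthogonal eigenvectors; the test matrix is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(11)
B = rng.standard_normal((5, 5))
A = B + B.T

# eigh exploits symmetry and returns the eigenvectors as the
# columns of an orthogonal matrix X.
eigvals, X = np.linalg.eigh(A)

print(np.allclose(X.T @ X, np.eye(5)))              # True: X is orthogonal
print(np.allclose(X @ np.diag(eigvals) @ X.T, A))   # True: A = X D X^T
```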