12.3.3. The Schur Decomposition of Symmetric Matrices

The Schur decomposition and eigendecomposition of a real, symmetric matrix have some unique properties. For this reason, researchers working on algorithms related to the eigenvalue problem often use symmetric matrices for the initial testing of their algorithms. An algorithm that cannot find the Schur decomposition of a symmetric matrix has little hope of operating successfully on a non-symmetric matrix.

12.3.3.1. Real Eigenvalues of a Symmetric Matrix

Theorem 12.2 (Real Eigenvalues of a Symmetric Matrix)

Let \mathbf{S} be a real, square, symmetric matrix, \mathbf{S} \in \mathbb{R}^{n{\times}n}, \mathbf{S} = \mathbf{S}^T. Then all eigenvalues of \mathbf{S} are real, and its eigenvectors may be taken to be real.

Proof. Our proof allows that an eigenvalue \lambda of a symmetric matrix \mathbf{S} and its eigenvector \bm{x} might be complex, with complex conjugates \bar{\lambda} and \bm{\bar{x}}, and then shows that the eigenvector equation is satisfied only by real eigenvalues. Once the eigenvalues are known to be real, the eigenvectors may be taken to be real as well.

We start with an eigenvalue equation and the complex conjugate of the same eigenvalue equation. We then pre-multiply the first equation by the eigenvector’s transpose conjugate and the second equation by the transpose of the eigenvector. Finally, we subtract to see that the eigenvalues must be real.

\begin{array}{cccl}
    & \bm{\bar{x}}^T\,\mathbf{S}\,\bm{x} &=
                &\lambda\,\bm{\bar{x}}^T\,\bm{x} \\[1ex]
- & \bm{x}^T\,\mathbf{S}\,\bm{\bar{x}} &=
            & \bar{\lambda}\,\bm{x}^T\,\bm{\bar{x}} \\[1mm]
        \cline{1-4} \\[-3mm]
    &  0 &= & \left( \lambda - \bar{\lambda} \right)\,
                \bm{x}^T\,\bm{\bar{x}}
\end{array}

Because of the symmetry of \mathbf{S}, the scalar values on the left-hand sides are equal (each is the transpose of the other), so they subtract to zero. On the right-hand side, the dot product \bm{x}^T\,\bm{\bar{x}} is the sum of the squared magnitudes of the entries of \bm{x}, that is, \norm{\bm{x}}^2, which cannot be zero for a nonzero vector. Thus, it must be that \lambda - \bar{\lambda} = 0, which is true only when \lambda is real.

\qed
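As a quick numerical check of Theorem 12.2 (a sketch using NumPy; the random matrix here is illustrative and not from the text), we can build an arbitrary symmetric matrix and verify that its eigenvalues have no imaginary part.

```python
import numpy as np

rng = np.random.default_rng(42)

# A + A^T is always symmetric, whatever A is.
A = rng.standard_normal((5, 5))
S = A + A.T

# A general eigensolver may return complex eigenvalues, but for a
# symmetric S the imaginary parts vanish (up to round-off).
eigenvalues = np.linalg.eigvals(S)
print(np.max(np.abs(eigenvalues.imag)))  # essentially zero
```

The same check fails for a typical non-symmetric matrix, whose eigenvalues generally come in complex-conjugate pairs.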

12.3.3.2. Spectral Decomposition Theorem

We anticipate orthogonal eigenvectors of a symmetric matrix because the symmetry between the rows and columns of \bf{S} causes the matrix to remain symmetric as the Schur decomposition progresses. Because of this symmetry, the Schur decomposition yields \bf{T} as a diagonal matrix. So the Schur decomposition of a symmetric matrix is the eigendecomposition, and the Schur vectors are also the eigenvectors. A more formal proof follows.
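This claim is easy to observe numerically. The sketch below (assuming SciPy is available; the random matrix is illustrative) computes the real Schur form of a symmetric matrix and checks that the triangular factor \bf{T} is in fact diagonal.

```python
import numpy as np
from scipy.linalg import schur  # assumes SciPy is installed

rng = np.random.default_rng(7)
A = rng.standard_normal((4, 4))
S = A + A.T                      # real, symmetric test matrix

# Real Schur decomposition: S = Q T Q^T with Q orthogonal.
T, Q = schur(S)

# For symmetric S, T = Q^T S Q is both triangular and symmetric,
# so its off-diagonal entries collapse to (numerical) zero.
off_diagonal = T - np.diag(np.diag(T))
print(np.max(np.abs(off_diagonal)))  # round-off level
```

The diagonal of \bf{T} then holds the eigenvalues, and the columns of \bf{Q} are the eigenvectors.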

Theorem 12.3 (Spectral Decomposition Theorem)

Let \mathbf{S} be a real, square, symmetric matrix, \mathbf{S} \in \mathbb{R}^{n{\times}n}, \mathbf{S} = \mathbf{S}^T. Then \mathbf{S} has a full set of orthonormal eigenvectors, and its eigendecomposition takes the form

\mathbf{S} = \mathbf{Q\,\Lambda\,Q}^T, \quad
        \mathbf{Q\,Q}^T = \mathbf{I}

Proof. Recall that a scalar equals its own transpose, so the quadratic form \bm{x}_i^T\,\mathbf{S}\,\bm{x}_j may be transposed without changing its value. Then, because of the symmetry of matrix \bf{S}, we have the following equality between any two eigenvectors and the symmetric matrix.

\bm{x}_i^T\,\mathbf{S}\,\bm{x}_j = \bm{x}_j^T\,\mathbf{S}\,\bm{x}_i

Starting with two eigenvector equations, we can pre-multiply the first equation by the transpose of the eigenvector from the second equation, then pre-multiply the second equation by the transpose of the eigenvector from the first equation and subtract the two equations.

\begin{array}{cccl}
    & \bm{x}_j^T\,\mathbf{S}\,\bm{x}_i &=
                    &\lambda_i\,\bm{x}_j^T\,\bm{x}_i \\[1ex]
    - & \bm{x}_i^T\,\mathbf{S}\,\bm{x}_j &=
                    &\lambda_j\,\bm{x}_i^T\,\bm{x}_j \\[1mm]
        \cline{1-4} \\[-3mm]
    &  0 &= & \left( \lambda_i - \lambda_j \right)\, \bm{x}_i^T\,\bm{x}_j
\end{array}

When \lambda_i \neq \lambda_j, the dot product \bm{x}_i^T\,\bm{x}_j must be zero, so eigenvectors corresponding to distinct eigenvalues are orthogonal. For a repeated eigenvalue, an orthonormal basis of eigenvectors can always be chosen within its eigenspace. Because the eigenvectors are normalized to unit length, the eigenvector matrix \bf{Q} has orthonormal columns, and since it is square and real it is an orthogonal matrix.

\qed
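The spectral decomposition can be verified numerically with NumPy's symmetric eigensolver (a sketch with an illustrative random matrix): the eigenvector matrix should satisfy \mathbf{Q\,Q}^T = \mathbf{I}, and \mathbf{Q\,\Lambda\,Q}^T should reproduce \bf{S}.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
S = A + A.T                      # real, symmetric

# eigh is the dedicated symmetric eigensolver: it returns real
# eigenvalues and an orthogonal eigenvector matrix Q.
eigenvalues, Q = np.linalg.eigh(S)

# Orthogonality of Q and the spectral decomposition S = Q Lambda Q^T.
print(np.allclose(Q @ Q.T, np.eye(6)))
print(np.allclose(Q @ np.diag(eigenvalues) @ Q.T, S))
```

Using eigh rather than a general eigensolver is also the practical payoff of this theorem: the symmetric solver is faster and guarantees real, orthonormal output.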