12.4.5. Schur Decomposition of Symmetric Matrices

The Schur decomposition and eigendecomposition of a real, symmetric matrix have special properties. We anticipate orthogonal eigenvectors of a symmetric matrix because the symmetry between the rows and columns of \(\bf{S}\) keeps the matrix symmetric as the Schur decomposition progresses. Because of this symmetry, the upper triangular matrix \(\bf{T}\) from the Schur decomposition is, in fact, diagonal. So the Schur decomposition of a symmetric matrix is its eigendecomposition, and the Schur vectors are also the eigenvectors. A more formal argument follows: we first establish that the eigenvalues of a symmetric matrix are real, and then show that its eigenvectors are orthogonal.
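As a quick numerical check of this claim, the sketch below builds a random symmetric matrix, computes its Schur decomposition, and confirms that \(\bf{T}\) is diagonal with the eigenvalues on its diagonal. It assumes NumPy and SciPy are available (`scipy.linalg.schur` and `numpy.linalg.eigh` are one way to compute the two factorizations); the matrix and seed are arbitrary.

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(7)
A = rng.standard_normal((4, 4))
S = (A + A.T) / 2                  # force symmetry: S = S^T

T, Z = schur(S)                    # S = Z T Z^T with Z orthogonal

# For a symmetric S, the off-diagonal part of T is numerically zero.
print(np.max(np.abs(T - np.diag(np.diag(T)))))       # ~1e-15

# The diagonal of T holds the eigenvalues of S (compare with eigh).
print(np.sort(np.diag(T)))
print(np.linalg.eigh(S)[0])
```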

Real Eigenvalues of a Symmetric Matrix

Theorem 12.3 (Real Eigenvalues of a Symmetric Matrix)

Let \(\bf{S}\) be a real, square, and symmetric matrix, \(\mathbf{S} \in \mathbb{R}^{n{\times}n}\), \(\mathbf{S} = \mathbf{S}^T\), then all eigenvalues and eigenvectors of \(\bf{S}\) are real.

Proof. Our proof allows for the possibility that an eigenvalue \(\lambda\) of a symmetric matrix \(\bf{S}\) and its eigenvector \(\bm{x}\) are complex, with complex conjugates \(\bar{\lambda}\) and \(\bm{\bar{x}}\), and then shows that the eigenvalue equation can only be satisfied when the eigenvalues are real. Once the eigenvalues are known to be real, the eigenvectors may also be taken to be real.

We start with an eigenvalue equation and the complex conjugate of the same eigenvalue equation. We then pre-multiply the first equation by the eigenvector’s transpose conjugate and the second equation by the transpose of the eigenvector. Finally, we subtract to see that the eigenvalues must be real.

\[\begin{split}\begin{array}{cccl} & \bm{\bar{x}}^T\,\mathbf{S}\,\bm{x} &= &\lambda\,\bm{\bar{x}}^T\,\bm{x} \\[1ex] - & \bm{x}^T\,\mathbf{S}\,\bm{\bar{x}} &= & \bar{\lambda}\,\bm{x}^T\,\bm{\bar{x}} \\[1mm] \hline \\[-5mm] & 0 &= & \left( \lambda - \bar{\lambda} \right)\, \bm{x}^T\,\bm{\bar{x}} \end{array}\end{split}\]

Because of the symmetry of \(\bf{S}\), the scalars on the two left-hand sides are equal, so they subtract to zero. On the right-hand side, the dot product \(\bm{x}^T\,\bm{\bar{x}}\) is the sum of the squared magnitudes of the eigenvector's elements, \(\norm{\bm{x}}^2\), which cannot be zero for a nonzero vector. Thus, it must be that \(\lambda - \bar{\lambda} = 0\), which is true only when \(\lambda\) is real.

\(\qed\)
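The following sketch (an illustration assuming NumPy; the matrix and seed are arbitrary) contrasts a generic real matrix, whose eigenvalues may appear in complex conjugate pairs, with its symmetrized version, whose eigenvalues are real as the theorem states.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))    # generic real matrix
S = (A + A.T) / 2                  # real, symmetric matrix

# A generic real matrix may have complex conjugate eigenvalue pairs.
print(np.linalg.eigvals(A))

# The symmetric matrix has real eigenvalues (imaginary parts ~ 0).
print(np.max(np.abs(np.linalg.eigvals(S).imag)))
```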

Spectral Decomposition Theorem

Theorem 12.4 (Spectral Decomposition Theorem)

Let \(\bf{S}\) be a real, square, and symmetric matrix, \(\mathbf{S} \in \mathbb{R}^{n{\times}n}\), \(\mathbf{S} = \mathbf{S}^T\), then the eigenvectors are orthogonal to each other.

\[\mathbf{S} = \mathbf{Q\,\Lambda\,Q}^T, \quad \mathbf{Q\,Q}^T = \mathbf{I}\]

Proof. Recall that the commutative property of the dot product allows the two vectors to be reversed, so \(\bm{x}_i^T\,(\mathbf{S}\,\bm{x}_j) = (\mathbf{S}\,\bm{x}_j)^T\,\bm{x}_i = \bm{x}_j^T\,\mathbf{S}^T\,\bm{x}_i\). Then, because of the symmetry of matrix \(\bf{S}\), we have the following equality between any two eigenvectors, \(\bm{x}_i\) and \(\bm{x}_j\), and the symmetric matrix.

(12.6)\[\bm{x}_i^T\,\mathbf{S}\,\bm{x}_j = \bm{x}_j^T\,\mathbf{S}\,\bm{x}_i\]

We can apply the eigenvalue equation to each side of equation (12.6) and subtract the two equations.

\[\begin{split}\begin{array}{cccl} & \bm{x}_i^T\,\mathbf{S}\,\bm{x}_j &= &\lambda_j\,\bm{x}_i^T\,\bm{x}_j \\[1ex] - & \bm{x}_j^T\,\mathbf{S}\,\bm{x}_i &= &\lambda_i\,\bm{x}_j^T\,\bm{x}_i \\[1mm] \hline \\[-5mm] & 0 &= & \left( \lambda_j - \lambda_i \right)\, \bm{x}_i^T\,\bm{x}_j \end{array}\end{split}\]

When the eigenvalues are distinct (\(\lambda_i \neq \lambda_j\)), the dot product \(\bm{x}_i^T\,\bm{x}_j\) must be zero, so the eigenvectors of a symmetric matrix are orthogonal. (For a repeated eigenvalue, an orthogonal basis of its eigenspace can always be chosen.) Since orthogonal transformation matrices are used in the decomposition, the eigenvectors are of unit length, so the eigenvector matrix, \(\bf{Q}\), being square and real with orthonormal columns, is orthogonal.

\(\qed\)
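As a brief numerical illustration of the spectral decomposition, the sketch below assumes NumPy (`numpy.linalg.eigh` is one way to obtain \(\mathbf{Q}\) and \(\mathbf{\Lambda}\) for a symmetric matrix) and verifies that \(\mathbf{Q\,Q}^T = \mathbf{I}\) and that \(\mathbf{Q\,\Lambda\,Q}^T\) reconstructs \(\bf{S}\).

```python
import numpy as np

rng = np.random.default_rng(11)
A = rng.standard_normal((4, 4))
S = (A + A.T) / 2                  # real, symmetric matrix

lam, Q = np.linalg.eigh(S)         # eigenvalues and orthonormal eigenvectors

# Q is orthogonal: Q Q^T = I
print(np.allclose(Q @ Q.T, np.eye(4)))              # True

# S is recovered from Q Lambda Q^T
print(np.allclose(Q @ np.diag(lam) @ Q.T, S))       # True
```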