13.4. Linearly Independent Eigenvectors Theorem


Theorem 13.1 (Linearly Independent Eigenvectors)

When the eigenvalues of a matrix are distinct, the corresponding eigenvectors form a set of linearly independent vectors (Vector Spaces).
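Before turning to the proof, a quick numerical check can make the claim concrete. The sketch below is a minimal NumPy illustration, not part of the proof; the matrix is an arbitrary choice assumed to have distinct eigenvalues.

```python
import numpy as np

# An arbitrary symmetric tridiagonal matrix; its eigenvalues are distinct.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

evals, evecs = np.linalg.eig(A)      # columns of evecs are the eigenvectors
print(np.round(evals, 4))            # three distinct eigenvalues

# The eigenvectors are linearly independent exactly when the matrix
# holding them as columns has full rank.
print(np.linalg.matrix_rank(evecs))  # 3, so the eigenvectors are independent
```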

Proof. We first show that pairs of eigenvectors are linearly independent, and then extend the argument to the linear independence of all of the eigenvectors.

Pairwise Independence

Let \(\lambda_1\) and \(\lambda_2\) be distinct eigenvalues of \(\mathbf{A}\), with corresponding eigenvectors \(\bm{x}_1\) and \(\bm{x}_2\). To prove that \(\bm{x}_1\) and \(\bm{x}_2\) are linearly independent, we need to show that if

\[c_1\,\bm{x}_1 + c_2\,\bm{x}_2 = \bm{0},\]

then it must be that \(c_1 = c_2 = 0\). Make two copies of the above equation: multiply one on the left by \(\mathbf{A}\), and multiply the other on the left by \(\lambda_2\).

\[\begin{split}\begin{array}{l} \mathbf{A}\,(c_1\,\bm{x}_1 + c_2\,\bm{x}_2) = c_1\,\lambda_1\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\[1ex] \lambda_2\,(c_1\,\bm{x}_1 + c_2\,\bm{x}_2) = c_1\,\lambda_2\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \end{array}\end{split}\]

Now subtract the second equation from the first.

\[\begin{split}\begin{array}{cl} &c_1\,\lambda_1\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\[1ex] - &c_1\,\lambda_2\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\[1mm] \hline &c_1\,(\lambda_1 - \lambda_2)\,\bm{x}_1 = \bm{0} \\[1mm] \end{array}\end{split}\]

Since \(\lambda_1 \neq \lambda_2\) and \(\bm{x}_1 \neq \bm{0}\), we conclude that \(c_1 = 0\). Substituting \(c_1 = 0\) into the original equation leaves \(c_2\,\bm{x}_2 = \bm{0}\), and since \(\bm{x}_2 \neq \bm{0}\), it follows that \(c_2 = 0\) as well. Thus, \(c_1\,\bm{x}_1 + c_2\,\bm{x}_2 = \bm{0}\) implies \(c_1 = c_2 = 0\), so the eigenvectors \(\bm{x}_1\) and \(\bm{x}_2\) are linearly independent of each other.
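As a numerical sanity check on the pairwise argument, the sketch below (an illustrative 2-by-2 example, not from the text) verifies that \(c_1\,\bm{x}_1 + c_2\,\bm{x}_2 = \bm{0}\) has only the trivial solution: the matrix with columns \(\bm{x}_1\) and \(\bm{x}_2\) has a nonzero determinant.

```python
import numpy as np

# Triangular matrix, so the eigenvalues (2 and 5) are read off the diagonal.
A = np.array([[2.0, 1.0],
              [0.0, 5.0]])

evals, evecs = np.linalg.eig(A)
x1, x2 = evecs[:, 0], evecs[:, 1]

# c1*x1 + c2*x2 = 0 has only the trivial solution c1 = c2 = 0
# exactly when the matrix [x1 x2] is nonsingular.
print(np.linalg.det(np.column_stack([x1, x2])))  # nonzero determinant
```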

General Independence

Given that the eigenpairs (\(\lambda_1, \bm{x}_1\)) and (\(\lambda_2, \bm{x}_2\)) are independent, there cannot exist a third eigenpair (\(\lambda_3, \bm{x}_3\)) such that \(\bm{x}_3 = k\,\bm{x}_1\) or \(\bm{x}_3 = k\,\bm{x}_2\) for any scalar \(k\). To prove general independence, we only need to show that a third eigenvector cannot be a linear combination of the other two eigenvectors.

This is a proof by contradiction, so we begin by considering an assertion that we will prove false.

If eigenvector \(\bm{x}_3\) is a linear combination of \(\bm{x}_1\) and \(\bm{x}_2\), then there must exist constants \(k_1\) and \(k_2\) such that

(13.1)\[\bm{x}_3 = k_1\,\bm{x}_1 + k_2\,\bm{x}_2.\]

Since \(\mathbf{A}\,\bm{x}_3 = \lambda_3\,\bm{x}_3\),

\[\mathbf{A}\,\bm{x}_3 = \lambda_3 (k_1\,\bm{x}_1 + k_2\,\bm{x}_2) = \lambda_3\,k_1\,\bm{x}_1 + \lambda_3\,k_2\,\bm{x}_2.\]

Because \(\mathbf{A}\,\bm{x}_1 = \lambda_1\,\bm{x}_1\) and \(\mathbf{A}\,\bm{x}_2 = \lambda_2\,\bm{x}_2\), we can substitute \(\bm{x}_1 = \frac{1}{\lambda_1}\mathbf{A}\,\bm{x}_1\) and \(\bm{x}_2 = \frac{1}{\lambda_2}\mathbf{A}\,\bm{x}_2\).

(13.2)\[\mathbf{A}\,\bm{x}_3 = \frac{\lambda_3\,k_1}{\lambda_1}\mathbf{A}\,\bm{x}_1 + \frac{\lambda_3\,k_2}{\lambda_2}\mathbf{A}\,\bm{x}_2\]
(13.3)\[\bm{x}_3 = \frac{\lambda_3\,k_1}{\lambda_1}\bm{x}_1 + \frac{\lambda_3\,k_2}{\lambda_2}\bm{x}_2\]

Pre-multiplying both sides of equation (13.2) by \(\mathbf{A}^{-1}\) removes the \(\mathbf{A}\) matrices, leaving equation (13.3), which must agree with equation (13.1). Because \(\bm{x}_1\) and \(\bm{x}_2\) are linearly independent, \(\bm{x}_3\) has a unique representation in terms of them, so the coefficients in equations (13.1) and (13.3) must match: \(k_1 = \frac{\lambda_3\,k_1}{\lambda_1}\) and \(k_2 = \frac{\lambda_3\,k_2}{\lambda_2}\). Since \(k_1\) and \(k_2\) are nonzero (otherwise \(\bm{x}_3\) would be a scalar multiple of a single eigenvector, which was ruled out above), it must be that \(\lambda_3 = \lambda_1\) and \(\lambda_3 = \lambda_2\), so \(\lambda_1 = \lambda_2 = \lambda_3\). This contradicts the hypothesis that the eigenvalues are distinct, so equation (13.1) is false. If the eigenvalues are distinct, then the eigenvectors are linearly independent.

\(\qed\)
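The contradiction at the heart of the general-independence argument can also be observed numerically. The sketch below (again an illustrative matrix, here non-symmetric so the eigenvectors are independent but not orthogonal) attempts to find the \(k_1\) and \(k_2\) of equation (13.1) by least squares; the nonzero residual shows that no such constants exist.

```python
import numpy as np

# Triangular matrix with distinct eigenvalues 2, 3, and 5;
# non-symmetric, so the eigenvectors are independent but not orthogonal.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

evals, evecs = np.linalg.eig(A)
x1, x2, x3 = evecs[:, 0], evecs[:, 1], evecs[:, 2]

# Attempt x3 = k1*x1 + k2*x2 (equation (13.1)) by least squares.
# A nonzero residual means no such k1 and k2 exist.
k, residual, *_ = np.linalg.lstsq(np.column_stack([x1, x2]), x3, rcond=None)
print(residual)   # nonzero, so x3 is not a combination of x1 and x2
```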