13.4. Linearly Independent Eigenvectors

When the eigenvalues of a matrix are distinct (no repeated values), the corresponding eigenvectors form a linearly independent set. Here we prove this result. We first show that pairs of eigenvectors are linearly independent, and then extend the argument to establish linear independence of all of the eigenvectors.

There are shorter proofs of this result, but the proof given here is simpler to follow.

13.4.1. Pairwise Independence

Let \lambda_1 and \lambda_2 be distinct eigenvalues of \mathbf{A}, with corresponding eigenvectors \bm{x}_1 and \bm{x}_2. To prove that \bm{x}_1 and \bm{x}_2 are linearly independent, we need to show that if

c_1\,\bm{x}_1 + c_2\,\bm{x}_2 = \bm{0},

then it must be that c_1 = c_2 = 0.

Make two copies of the above equation. Multiply the first on the left by \mathbf{A} and multiply the second by \lambda_2.

\begin{array}{cl}
&\mathbf{A}\,(c_1\,\bm{x}_1 + c_2\,\bm{x}_2) =
c_1\,\lambda_1\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\
\hfill \\
&\lambda_2\,(c_1\,\bm{x}_1 + c_2\,\bm{x}_2) =
c_1\,\lambda_2\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0}
\end{array}

Now subtract the second equation from the first.

\begin{array}{cl}
  &c_1\,\lambda_1\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\[1ex]
- &c_1\,\lambda_2\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\[1mm]
\cline{1-2} \\[-3mm]
  &c_1\,(\lambda_1 - \lambda_2)\,\bm{x}_1 = \bm{0} \\[1mm]
  \end{array}

Since \lambda_1 \neq \lambda_2 and \bm{x}_1 \neq \bm{0}, it must be that c_1 = 0. With c_1 = 0, the original equation reduces to c_2\,\bm{x}_2 = \bm{0}, and since \bm{x}_2 \neq \bm{0}, it follows that c_2 = 0 as well. Thus c_1\,\bm{x}_1 + c_2\,\bm{x}_2 = \bm{0} only when c_1 = c_2 = 0, so the eigenvectors \bm{x}_1 and \bm{x}_2 are linearly independent of each other.
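As a quick numerical illustration (not part of the proof), we can compute the eigenvectors of a small matrix with distinct eigenvalues and confirm that neither eigenvector is a scalar multiple of the other. The matrix below is an arbitrary choice for this sketch; any matrix with distinct eigenvalues would do.

```python
import numpy as np

# Arbitrary 2x2 matrix chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

evals, evecs = np.linalg.eig(A)
print("eigenvalues:", evals)   # two distinct values

# The eigenvectors are the columns of evecs.  If one were a scalar
# multiple of the other, the determinant of the eigenvector matrix
# would be zero.
print("det of eigenvector matrix:", np.linalg.det(evecs))  # nonzero
```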

13.4.2. General Independence

Given that the (eigenvalue, eigenvector) pairs (\lambda_1, \bm{x}_1) and (\lambda_2, \bm{x}_2) are independent, there cannot exist a third pair (\lambda_3, \bm{x}_3) such that \bm{x}_3 = k\,\bm{x}_1 or \bm{x}_3 = k\,\bm{x}_2 for any scalar k. To establish general independence, we only need to show that a third eigenvector cannot be a linear combination of the other eigenvectors.

This is a proof by contradiction, so we begin by considering an assertion that we will prove to be false.

If eigenvector \bm{x}_3 is a linear combination of \bm{x}_1 and \bm{x}_2, then there must exist constants k_1 and k_2 such that

(13.1) \qquad \bm{x}_3 = k_1\,\bm{x}_1 + k_2\,\bm{x}_2.

Since \mathbf{A}\,\bm{x}_3 = \lambda_3\,\bm{x}_3,

\mathbf{A}\,\bm{x}_3 = \lambda_3 (k_1\,\bm{x}_1 + k_2\,\bm{x}_2) =
\lambda_3\,k_1\,\bm{x}_1 + \lambda_3\,k_2\,\bm{x}_2.

Since \mathbf{A}\,\bm{x}_1 = \lambda_1\,\bm{x}_1 and \mathbf{A}\,\bm{x}_2 = \lambda_2\,\bm{x}_2, we can substitute \bm{x}_1 = \frac{1}{\lambda_1}\mathbf{A}\,\bm{x}_1 and \bm{x}_2 = \frac{1}{\lambda_2}\mathbf{A}\,\bm{x}_2.

(13.2) \qquad \mathbf{A}\,\bm{x}_3 =
        \frac{\lambda_3\,k_1}{\lambda_1}\mathbf{A}\,\bm{x}_1 +
        \frac{\lambda_3\,k_2}{\lambda_2}\mathbf{A}\,\bm{x}_2

Pre-multiplying both sides of (13.2) by \mathbf{A}^{-1} removes the \mathbf{A} matrices, leaving an equation of the same form as (13.1). Because \bm{x}_1 and \bm{x}_2 are linearly independent, the coefficients in the two equations must match: k_1 = \frac{\lambda_3\,k_1}{\lambda_1} and k_2 = \frac{\lambda_3\,k_2}{\lambda_2}. If (13.1) and (13.2) are both true, it must then be that \lambda_3 = \lambda_1 and \lambda_3 = \lambda_2, so \lambda_1 = \lambda_2 = \lambda_3, which contradicts the initial statement that the eigenvalues are distinct. Therefore, (13.1) is false. If the eigenvalues are distinct, then all eigenvectors are linearly independent.
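As a numerical sketch of the general result (again, not part of the proof itself), we can check that a matrix with distinct eigenvalues yields an eigenvector matrix of full rank, which means all of its eigenvectors are linearly independent. The 3-by-3 matrix below is a hypothetical example chosen only for illustration.

```python
import numpy as np

# Arbitrary 3x3 matrix chosen only for illustration; its eigenvalues
# happen to be distinct, so the theorem predicts independent eigenvectors.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

evals, evecs = np.linalg.eig(A)
print("eigenvalues:", evals)

# If the columns of evecs (the eigenvectors) are linearly independent,
# the eigenvector matrix has full rank.
print("rank of eigenvector matrix:", np.linalg.matrix_rank(evecs))  # expect 3
```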