13.4. Linearly Independent Eigenvectors
When the eigenvalues of a matrix are distinct, the eigenvectors form a linearly independent set. Here we prove that when a matrix has distinct eigenvalues, its eigenvectors are linearly independent. We first show that pairs of eigenvectors are linearly independent, and then extend the argument to show that all of the eigenvectors are linearly independent.
There are shorter proofs of eigenvector independence, but this one is simpler to follow.
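Before the proof, a quick numerical check of the claim may be helpful. The following is a minimal sketch using NumPy; the matrix $A$ is chosen here only for illustration. A matrix with distinct eigenvalues yields an eigenvector matrix of full rank, which means the eigenvectors are linearly independent.

```python
import numpy as np

# Example matrix chosen for illustration; its eigenvalues are distinct.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# The columns of X are the eigenvectors found by NumPy.
evals, X = np.linalg.eig(A)
print("eigenvalues:", evals)                   # 5 and 2 -- distinct

# Independent eigenvectors <=> the eigenvector matrix has full rank.
print("rank of X:", np.linalg.matrix_rank(X))  # 2, so independent
```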
13.4.1. Pairwise Independence
Let $\lambda_1$ and $\lambda_2$ be distinct eigenvalues of $A$, with corresponding eigenvectors $x_1$ and $x_2$. To prove that $x_1$ and $x_2$ are linearly independent, we need to show that if

$$c_1 x_1 + c_2 x_2 = 0,$$

then it must be that $c_1 = c_2 = 0$.
Make a copy of the above equation. Multiply one equation on the left by $A$ and multiply the other equation on the left by $\lambda_2$.

$$A \left( c_1 x_1 + c_2 x_2 \right) = c_1 \lambda_1 x_1 + c_2 \lambda_2 x_2 = 0$$

$$\lambda_2 \left( c_1 x_1 + c_2 x_2 \right) = c_1 \lambda_2 x_1 + c_2 \lambda_2 x_2 = 0$$
Now subtract.

$$c_1 \left( \lambda_1 - \lambda_2 \right) x_1 = 0$$
Since the $\lambda$’s are different and $x_1 \ne 0$, we conclude that $c_1 = 0$. Substituting $c_1 = 0$ into the original equation leaves $c_2 x_2 = 0$, and since $x_2 \ne 0$, we similarly conclude that $c_2 = 0$. Thus $c_1 x_1 + c_2 x_2 = 0$ only when $c_1 = c_2 = 0$. So the eigenvectors $x_1$ and $x_2$ are linearly independent of each other.
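For a concrete instance of the argument (the same illustrative matrix used above), consider

$$A = \begin{bmatrix} 4 & 1 \\ 2 & 3 \end{bmatrix}, \qquad \lambda_1 = 5,\ x_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \qquad \lambda_2 = 2,\ x_2 = \begin{bmatrix} 1 \\ -2 \end{bmatrix}.$$

Here the subtraction step gives $c_1 (5 - 2)\, x_1 = 0$, so $c_1 = 0$, and then $c_2 x_2 = 0$ forces $c_2 = 0$, as the proof requires.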
13.4.2. General Independence
Given that the eigenvalue, eigenvector pairs $(\lambda_1, x_1)$ and $(\lambda_2, x_2)$ are independent, there cannot exist a third pair $(\lambda_3, x_3)$ such that $x_3 = a\,x_1$ or $x_3 = a\,x_2$ for any scalar $a$. To establish general independence, we have only to show that a third eigenvector cannot be a linear combination of other eigenvectors.
This is a proof by contradiction, so we begin by considering an assertion that we will prove to be false.
If eigenvector $x_3$ is a linear combination of $x_1$ and $x_2$, then there must exist constants $c_1$ and $c_2$ such that

$$x_3 = c_1 x_1 + c_2 x_2 \tag{13.1}$$
Since $A x_3 = \lambda_3 x_3$,

$$\lambda_3 x_3 = A \left( c_1 x_1 + c_2 x_2 \right) = c_1 A x_1 + c_2 A x_2.$$

Vectors $\lambda_1 x_1$ and $\lambda_2 x_2$ can be substituted for $A x_1$ and $A x_2$.

$$\lambda_3 x_3 = c_1 \lambda_1 x_1 + c_2 \lambda_2 x_2 \tag{13.2}$$
Pre-multiplying both sides of (13.2) by $\lambda_3^{-1}$ removes the eigenvalue from the left side, leaving (13.1) and (13.2) as equivalent expressions for $x_3$. Because $x_1$ and $x_2$ are independent, the coefficients of the two expressions must match. Note that $c_1$ and $c_2$ are nonzero, since otherwise $x_3$ would be a scalar multiple of a single eigenvector, which was ruled out above. So if (13.1) and (13.2) are both true, then it must be that $\lambda_1 = \lambda_3$ and $\lambda_2 = \lambda_3$, so $\lambda_1 = \lambda_2 = \lambda_3$, which is a contradiction of the initial statement that each eigenvalue is distinct. Therefore, (13.1) is false. If each eigenvalue is distinct, then all eigenvectors are linearly independent.
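A numerical check of the general case may also be helpful. The following is a minimal sketch using NumPy with an upper-triangular matrix chosen for illustration (its eigenvalues, 2, 3, and 5, are distinct). It confirms that no combination of two eigenvectors reproduces the third.

```python
import numpy as np

# Upper-triangular example; the eigenvalues 2, 3, 5 are distinct.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

evals, X = np.linalg.eig(A)
x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]

# Try to write x3 = c1*x1 + c2*x2 via least squares.
C = np.column_stack([x1, x2])
c, residual, *_ = np.linalg.lstsq(C, x3, rcond=None)
print("best-fit residual:", residual)          # nonzero: no such c1, c2

# Equivalently, the three eigenvectors together have full rank.
print("rank of X:", np.linalg.matrix_rank(X))  # 3, so independent
```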