.. _IndepEigs:

Linearly Independent Eigenvectors Theorem
=========================================

.. _th-indepEigs:

.. rubric:: Linearly Independent Eigenvectors

.. prf:theorem:: Linearly Independent Eigenvectors
    :label: th-indepEigs

    When the eigenvalues of a matrix are distinct (unique), the
    eigenvectors form a set of linearly independent vectors
    (:ref:`linIndepVectors`).

.. prf:proof::

    We first show that pairs of eigenvectors are linearly independent, and
    then extend the argument to show general linear independence of all of
    the eigenvectors.

    **Pairwise Independence**

    Let :math:`\lambda_1` and :math:`\lambda_2` be distinct eigenvalues of
    :math:`\mathbf{A}`, with corresponding eigenvectors :math:`\bm{x}_1`
    and :math:`\bm{x}_2`. To prove that :math:`\bm{x}_1` and
    :math:`\bm{x}_2` are linearly independent, we need to show that if

    .. math::

        c_1\,\bm{x}_1 + c_2\,\bm{x}_2 = \bm{0},

    then it must be that :math:`c_1 = c_2 = 0`.

    Make two copies of the above equation. Multiply one copy on the left by
    :math:`\mathbf{A}`, and multiply the other copy by the scalar
    :math:`\lambda_2`.

    .. math::

        \begin{array}{cl}
            &\mathbf{A}\,(c_1\,\bm{x}_1 + c_2\,\bm{x}_2)
                = c_1\,\lambda_1\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2
                = \bm{0} \\
            \hfill \\
            &\lambda_2\,(c_1\,\bm{x}_1 + c_2\,\bm{x}_2)
                = c_1\,\lambda_2\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2
                = \bm{0}
        \end{array}

    Now subtract.

    .. math::

        \begin{array}{cl}
            &c_1\,\lambda_1\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\[1ex]
            - &c_1\,\lambda_2\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\[1mm]
            \hline
            &c_1\,(\lambda_1 - \lambda_2)\,\bm{x}_1 = \bm{0} \\[1mm]
        \end{array}

    Since the :math:`\lambda`\ ’s are different and
    :math:`\bm{x}_1 \neq \bm{0}`, we conclude that :math:`c_1 = 0`. The
    same argument with :math:`\lambda_1` in place of :math:`\lambda_2`
    gives :math:`c_2 = 0`. Thus,
    :math:`c_1\,\bm{x}_1 + c_2\,\bm{x}_2 = \bm{0}` implies
    :math:`c_1 = c_2 = 0`, so the eigenvectors :math:`\bm{x}_1` and
    :math:`\bm{x}_2` are linearly independent of each other.
    **General Independence**

    Given that the eigenpairs (:math:`\lambda_1, \bm{x}_1`) and
    (:math:`\lambda_2, \bm{x}_2`) are independent, there cannot exist a
    third eigenpair (:math:`\lambda_3, \bm{x}_3`) such that
    :math:`\bm{x}_3 = k\,\bm{x}_1` or :math:`\bm{x}_3 = k\,\bm{x}_2` for
    any scalar :math:`k`. To prove general independence, we need only show
    that a third eigenvector cannot be a linear combination of the other
    eigenvectors. This is a proof by contradiction, so we begin with an
    assertion that we will show to be false.

    If eigenvector :math:`\bm{x}_3` were a linear combination of
    :math:`\bm{x}_1` and :math:`\bm{x}_2`, then there would exist constants
    :math:`k_1` and :math:`k_2` such that

    .. math::
        :label: eq-ind-eig1

        \bm{x}_3 = k_1\,\bm{x}_1 + k_2\,\bm{x}_2.

    Multiplying equation :eq:`eq-ind-eig1` on the left by
    :math:`\mathbf{A}` and using
    :math:`\mathbf{A}\,\bm{x}_1 = \lambda_1\,\bm{x}_1`,
    :math:`\mathbf{A}\,\bm{x}_2 = \lambda_2\,\bm{x}_2`, and
    :math:`\mathbf{A}\,\bm{x}_3 = \lambda_3\,\bm{x}_3` gives

    .. math::
        :label: eq-ind-eig2

        \lambda_3\,\bm{x}_3 = k_1\,\lambda_1\,\bm{x}_1
            + k_2\,\lambda_2\,\bm{x}_2.

    Multiplying equation :eq:`eq-ind-eig1` by the scalar :math:`\lambda_3`
    instead gives

    .. math::
        :label: eq-ind-eig3

        \lambda_3\,\bm{x}_3 = \lambda_3\,k_1\,\bm{x}_1
            + \lambda_3\,k_2\,\bm{x}_2.

    Subtracting equation :eq:`eq-ind-eig3` from equation :eq:`eq-ind-eig2`
    leaves

    .. math::

        k_1\,(\lambda_1 - \lambda_3)\,\bm{x}_1
            + k_2\,(\lambda_2 - \lambda_3)\,\bm{x}_2 = \bm{0}.

    Because :math:`\bm{x}_1` and :math:`\bm{x}_2` are linearly independent,
    both coefficients must be zero:
    :math:`k_1\,(\lambda_1 - \lambda_3) = 0` and
    :math:`k_2\,(\lambda_2 - \lambda_3) = 0`. Since the eigenvalues are
    distinct, :math:`\lambda_1 - \lambda_3 \neq 0` and
    :math:`\lambda_2 - \lambda_3 \neq 0`, so :math:`k_1 = k_2 = 0`. But
    then equation :eq:`eq-ind-eig1` gives :math:`\bm{x}_3 = \bm{0}`, which
    contradicts :math:`\bm{x}_3` being an eigenvector. Therefore, equation
    :eq:`eq-ind-eig1` is false.
    If each eigenvalue is distinct, then all eigenvectors are linearly
    independent. :math:`\qed`

.. index:: independent eigenvectors theorem
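The theorem can be checked numerically. The following sketch (an illustration, not part of the proof; the matrix is an arbitrary example whose eigenvalues happen to be distinct) uses NumPy to verify that the matrix whose columns are the eigenvectors has full rank, which is equivalent to linear independence of the eigenvectors.

```python
import numpy as np

# An arbitrary 3x3 example matrix; its eigenvalues are 3, 3 + sqrt(3),
# and 3 - sqrt(3), which are distinct.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors

# Confirm the eigenvalues are distinct: every pairwise difference is
# nonzero (the identity is added to exclude the zero diagonal).
assert np.all(np.abs(np.subtract.outer(eigvals, eigvals) + np.eye(3)) > 1e-9)

# Independence: the eigenvector matrix has rank n = 3.
print(np.linalg.matrix_rank(eigvecs))  # 3 -> linearly independent
```

A rank below 3 would mean some eigenvector is a linear combination of the others, which the theorem rules out when the eigenvalues are distinct.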