.. _IndepEigs:

Linearly Independent Eigenvectors
=====================================

.. index:: independent eigenvectors

When the eigenvalues of a matrix are distinct (unique), the eigenvectors form a
set of :ref:`linIndepVectors`. Here we show a proof that when a matrix has
distinct eigenvalues, its eigenvectors are linearly independent. We will first
show that pairs of eigenvectors are linearly independent, and then extend the
proof to general linear independence of all of the eigenvectors. There are
shorter proofs of this result, but this one seems simpler to follow.

Pairwise Independence
^^^^^^^^^^^^^^^^^^^^^^

Let :math:`\lambda_1` and :math:`\lambda_2` be distinct eigenvalues of
:math:`\mathbf{A}`, with corresponding eigenvectors :math:`\bm{x}_1` and
:math:`\bm{x}_2`. To prove that :math:`\bm{x}_1` and :math:`\bm{x}_2` are
linearly independent, we need to show that if

.. math::

    c_1\,\bm{x}_1 + c_2\,\bm{x}_2 = \bm{0},

then it must be that :math:`c_1 = c_2 = 0`.

Make a copy of the above equation. Multiply one equation on the left by
:math:`\mathbf{A}` and multiply the other equation on the left by
:math:`\lambda_2`.

.. math::

    \begin{array}{cl}
    &\mathbf{A}\,(c_1\,\bm{x}_1 + c_2\,\bm{x}_2)
        = c_1\,\lambda_1\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\
    \hfill \\
    &\lambda_2\,(c_1\,\bm{x}_1 + c_2\,\bm{x}_2)
        = c_1\,\lambda_2\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0}
    \end{array}

Now subtract.

.. math::

    \begin{array}{cl}
    &c_1\,\lambda_1\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\[1ex]
    - &c_1\,\lambda_2\,\bm{x}_1 + c_2\,\lambda_2\,\bm{x}_2 = \bm{0} \\[1mm]
    \cline{1-2} \\[-3mm]
    &c_1\,(\lambda_1 - \lambda_2)\,\bm{x}_1 = \bm{0} \\[1mm]
    \end{array}

Since :math:`\lambda_1 \neq \lambda_2` and :math:`\bm{x}_1 \neq \bm{0}`, we
conclude that :math:`c_1 = 0`. Substituting :math:`c_1 = 0` into the original
equation leaves :math:`c_2\,\bm{x}_2 = \bm{0}`, and since
:math:`\bm{x}_2 \neq \bm{0}`, it follows that :math:`c_2 = 0` as well. Thus
:math:`c_1\,\bm{x}_1 + c_2\,\bm{x}_2 = \bm{0}` only when
:math:`c_1 = c_2 = 0`.
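The pairwise argument can be checked numerically. The sketch below uses NumPy
with a small example matrix chosen here for illustration (it is not from the
text; any matrix with distinct eigenvalues would do): the equation
:math:`c_1\,\bm{x}_1 + c_2\,\bm{x}_2 = \bm{0}` has only the trivial solution
exactly when the two-column matrix :math:`[\bm{x}_1 \; \bm{x}_2]` has rank 2.

.. code-block:: python

    import numpy as np

    # Hypothetical example matrix with distinct eigenvalues
    # (chosen for illustration; eigenvalues are 5 and 2).
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    # Eigenvalues and eigenvectors (eigenvectors are the columns of V).
    lam, V = np.linalg.eig(A)
    assert len(set(np.round(lam, 8))) == 2   # the eigenvalues are distinct

    x1, x2 = V[:, 0], V[:, 1]

    # c1*x1 + c2*x2 = 0 has only the trivial solution c1 = c2 = 0
    # exactly when the matrix [x1 x2] has rank 2.
    rank = np.linalg.matrix_rank(np.column_stack([x1, x2]))
    print(rank)  # 2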
So the eigenvectors :math:`\bm{x}_1` and :math:`\bm{x}_2` are linearly
independent of each other.

General Independence
^^^^^^^^^^^^^^^^^^^^^^

Given that the eigenvalue, eigenvector pairs (:math:`\lambda_1, \bm{x}_1`) and
(:math:`\lambda_2, \bm{x}_2`) are independent, there cannot exist a third pair
(:math:`\lambda_3, \bm{x}_3`) such that :math:`\bm{x}_3 = k\,\bm{x}_1` or
:math:`\bm{x}_3 = k\,\bm{x}_2`, for any scalar :math:`k`. To establish general
independence, we have only to show that a third eigenvector cannot be a linear
combination of the other eigenvectors. This is a proof by contradiction, so we
begin by considering an assertion that we will prove to be false. If
eigenvector :math:`\bm{x}_3` is a linear combination of :math:`\bm{x}_1` and
:math:`\bm{x}_2`, then there must exist nonzero constants :math:`k_1` and
:math:`k_2` such that

.. math::
    :label: eq-ind-eig1

    \bm{x}_3 = k_1\,\bm{x}_1 + k_2\,\bm{x}_2.

(If either constant were zero, :math:`\bm{x}_3` would be a scalar multiple of a
single eigenvector, which pairwise independence has already ruled out.)

Since :math:`\mathbf{A}\,\bm{x}_3 = \lambda_3\,\bm{x}_3`,

.. math::

    \mathbf{A}\,\bm{x}_3 = \lambda_3 (k_1\,\bm{x}_1 + k_2\,\bm{x}_2)
        = \lambda_3\,k_1\,\bm{x}_1 + \lambda_3\,k_2\,\bm{x}_2.

Assuming :math:`\mathbf{A}` is invertible, so that every eigenvalue is nonzero,
we can substitute :math:`\bm{x}_1 = \frac{1}{\lambda_1}\mathbf{A}\,\bm{x}_1`
and :math:`\bm{x}_2 = \frac{1}{\lambda_2}\mathbf{A}\,\bm{x}_2`, giving

.. math::
    :label: eq-ind-eig2

    \mathbf{A}\,\bm{x}_3 = \frac{\lambda_3\,k_1}{\lambda_1}\mathbf{A}\,\bm{x}_1
        + \frac{\lambda_3\,k_2}{\lambda_2}\mathbf{A}\,\bm{x}_2.

Pre-multiplying both sides of :eq:`eq-ind-eig2` by :math:`\mathbf{A}^{-1}`
removes the :math:`\mathbf{A}` matrices, leaving :eq:`eq-ind-eig1` and
:eq:`eq-ind-eig2` as equivalent equations. Because :math:`\bm{x}_1` and
:math:`\bm{x}_2` are linearly independent, the coefficients of the two
expansions must match: :math:`k_1 = \lambda_3\,k_1/\lambda_1` and
:math:`k_2 = \lambda_3\,k_2/\lambda_2`. With :math:`k_1` and :math:`k_2`
nonzero, it must be that :math:`\lambda_3 = \lambda_1` and
:math:`\lambda_3 = \lambda_2`, so :math:`\lambda_1 = \lambda_2 = \lambda_3`,
which contradicts the initial statement that each eigenvalue is distinct.
Therefore, :eq:`eq-ind-eig1` is false.
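The contradiction can also be seen numerically: if we build a vector from a
linear combination of two eigenvectors belonging to different eigenvalues, the
result is not an eigenvector, because :math:`\mathbf{A}` stretches each
component by a different factor. A minimal sketch, using a hypothetical
diagonal matrix chosen here so the eigenvectors are obvious:

.. code-block:: python

    import numpy as np

    # Hypothetical matrix with three distinct eigenvalues (1, 2, and 3).
    A = np.diag([1.0, 2.0, 3.0])

    lam, V = np.linalg.eig(A)
    x1, x2 = V[:, 0], V[:, 1]

    # Suppose x3 = k1*x1 + k2*x2 with k1, k2 nonzero, as in the assertion.
    k1, k2 = 1.0, 1.0
    x3 = k1 * x1 + k2 * x2

    # If x3 were an eigenvector, A @ x3 would be a scalar multiple of x3.
    # It matches neither lam[0]*x3 nor lam[1]*x3 ...
    Ax3 = A @ x3
    print(np.allclose(Ax3, lam[0] * x3))  # False
    print(np.allclose(Ax3, lam[1] * x3))  # False

    # ... and in fact A @ x3 is not parallel to x3 at all: the pair spans
    # a 2-dimensional space, so x3 cannot be an eigenvector of A.
    print(np.linalg.matrix_rank(np.column_stack([x3, Ax3])))  # 2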
If each eigenvalue is distinct, then all eigenvectors are linearly independent.
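This conclusion can be verified for a concrete case: when the eigenvalues are
all distinct, the matrix whose columns are the eigenvectors has full rank and
is therefore invertible. The sketch below assumes a small symmetric
tridiagonal matrix chosen here for illustration (such matrices with nonzero
off-diagonal entries always have distinct eigenvalues):

.. code-block:: python

    import numpy as np

    # Hypothetical example matrix; its three eigenvalues are distinct.
    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 4.0]])

    lam, V = np.linalg.eig(A)

    # The eigenvalues are distinct ...
    assert len(np.unique(np.round(lam, 8))) == 3

    # ... so the eigenvector matrix V has full rank: its columns
    # (the eigenvectors) are linearly independent and V is invertible.
    print(np.linalg.matrix_rank(V))       # 3
    print(abs(np.linalg.det(V)) > 1e-12)  # True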