12.4.4. Eigenvectors Following the Schur Decomposition

After the Schur decomposition from either the QR algorithm or the Francis algorithm using Francis steps of degree one, the Schur vectors, \bf{Q}, are used to find the eigenvectors. Only the eigenvectors, \bf{V}, of the upper triangular matrix \bf{T} are needed to compute the eigenvectors of \bf{A}, which is implemented in the function UTeigVect. The final eigenvector matrix is then found by matrix multiplication.

\mathbf{X} = \mathbf{Q\,V}
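The overall pipeline can be sketched in NumPy/SciPy (the book's code is MATLAB; this is an illustrative translation that uses scipy.linalg.schur in its complex form, with the eigenvectors of \bf{T} taken from numpy.linalg.eig for brevity rather than from a triangular solver):

```python
import numpy as np
from scipy.linalg import schur

# A small test matrix (hypothetical data for illustration).
rng = np.random.default_rng(7)
A = rng.standard_normal((4, 4))

# Complex Schur decomposition: A = Q T Q^H with T upper triangular.
T, Q = schur(A, output='complex')

# Eigenvectors V of the triangular matrix T.
w, V = np.linalg.eig(T)

# Eigenvectors of A follow from X = Q V.
X = Q @ V
L = np.diag(w)
print(np.linalg.norm(A @ X - X @ L))  # close to zero
```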

Again, we need to find the null solution of each characteristic matrix. But the triangular structure of \bf{T} allows for a more efficient solution than computing SVDs. Let \mathbf{C}_k be the kth characteristic matrix, \mathbf{C}_k = \mathbf{T} -
\lambda_k\,\mathbf{I}. Because the eigenvalues are on the diagonal of \bf{T}, the kth diagonal element of \mathbf{C}_k is zero. We partition \mathbf{C}_k such that \mathbf{S}_{11} is of size k{\times}k with a zero in its last diagonal position.

(12.7)\mathbf{C}_k = \mat{\mathbf{S}_{11} \mathbf{S}_{12};
                   \mathbf{0} \mathbf{S}_{22}},
    \qquad \mat{\mathbf{S}_{11} \mathbf{S}_{12};
                \mathbf{0} \mathbf{S}_{22}} \, \vector{\bm{v}_1; \bm{v}_2}
                   = \vector{\bm{0}; \bm{0}}

\mathbf{S}_{11} and \mathbf{S}_{22} are upper triangular, while \mathbf{S}_{12} is, in general, rectangular. \mathbf{S}_{22} is non-singular because all of its diagonal elements are nonzero: they are the differences \lambda_j - \lambda_k for j > k, which are nonzero when the eigenvalues are distinct. Therefore, \bm{v}_2 must equal zero, and equation (12.7) reduces to \mathbf{S}_{11}\,\bm{v}_1 = \bm{0}. We can further partition \mathbf{S}_{11} to separate its last row and column from a non-singular, upper triangular matrix \bf{S}.

\mathbf{S}_{11} = \mat{\mathbf{S} \bm{r}; \bm{0} 0}, \qquad
    \mat{\mathbf{S} \bm{r}; \bm{0} 0}\,\vector{\hat{\bm{v}}_1; 1}
    = \vector{\bm{0}; 0}

The solution may then be found by back substitution with the upper triangular matrix \bf{S} and the vector \bm{r}.

\mathbf{S}\,\hat{\bm{v}}_1 + \bm{r} = \bm{0}, \qquad
    \hat{\bm{v}}_1 = -\mathbf{S}^{-1}\,\bm{r},

where \mathbf{S} \in \mathbb{C}^{(k-1){\times}(k-1)}, \bm{r} \in \mathbb{C}^{(k-1){\times}1}, and \hat{\bm{v}}_1 \in \mathbb{C}^{(k-1){\times}1}. The jth element of the kth eigenvector of \bf{T} is then given by

v_{j,k} = \begin{cases}
        \left(-\mathbf{S}_k^{-1}\,\bm{r}_k\right)_j & \text{for } j \in \{1, \cdots, k-1\} \\
        1 & \text{for } j = k \\
        0 & \text{for } j \in \{k+1, \cdots, n\}.
    \end{cases}
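As a numeric check of this formula, consider a hypothetical 3{\times}3 triangular matrix and the case k = 2 (a NumPy sketch; numpy.linalg.solve stands in for back substitution, and the matrix values are illustrative):

```python
import numpy as np

# An upper triangular T with distinct eigenvalues on its diagonal.
T = np.array([[2.0, 1.0, 3.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

k = 2                            # second eigenvector (1-based, as in the text)
C = T - T[k-1, k-1] * np.eye(3)  # characteristic matrix C_k
S = C[:k-1, :k-1]                # leading (k-1)x(k-1) triangular block
r = C[:k-1, k-1]                 # column above the zero diagonal element
v_hat = np.linalg.solve(S, -r)   # back substitution: S v1 = -r

v = np.concatenate([v_hat, [1.0], np.zeros(3 - k)])
v /= np.linalg.norm(v)           # scale to a unit vector
print(np.allclose(T @ v, T[k-1, k-1] * v))  # True: T v = lambda_2 v
```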

Finally, each eigenvector is scaled to a unit vector.

function V = UTeigVect(T)
% UTEIGVECT - Find the eigenvectors of upper triangular matrix T.
%           The eigenvalues should be supplied on the diagonal of T.
% The structure of T allows finding the eigenvectors with triangular
% back substitution rather than as the null solution of the
% characteristic equation.

    n = size(T,2);
    V = eye(n);
    I = eye(n);
    for k = 2:n
        C = T - T(k,k)*I;
        S = C(1:k-1, 1:k-1);
        r = C(1:k-1, k);
        % solve with triangular back substitution
        V(1:k-1, k) = -S\r;
        V(1:k, k) = V(1:k, k)/norm(V(1:k, k));
    end
end
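For comparison, here is a NumPy port of the same procedure (a sketch under the assumption of distinct diagonal eigenvalues; the name ut_eig_vect and the test matrix are illustrative, and scipy.linalg.solve_triangular performs the back substitution):

```python
import numpy as np
from scipy.linalg import solve_triangular

def ut_eig_vect(T):
    """Eigenvectors of an upper triangular T (a port of UTeigVect).

    Assumes the eigenvalues on the diagonal of T are distinct.
    """
    n = T.shape[1]
    V = np.eye(n, dtype=complex)
    for k in range(1, n):              # columns 2..n in 1-based terms
        C = T - T[k, k] * np.eye(n)
        S = C[:k, :k]
        r = C[:k, k]
        # Triangular back substitution: S v1 = -r
        V[:k, k] = solve_triangular(S, -r)
        V[:, k] /= np.linalg.norm(V[:, k])
    return V

T = np.array([[4.0, 2.0, 1.0],
              [0.0, 2.0, 3.0],
              [0.0, 0.0, 1.0]])
V = ut_eig_vect(T)
print(np.linalg.norm(T @ V - V @ np.diag(np.diag(T))))  # near zero
```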

Note

To test the correctness of the eigendecomposition, the MATLAB command A*X - X*L should produce a matrix with values close to zero. The eigenvalues should match those returned by MATLAB's eig function. The eigenvectors, however, can differ and still be correct because an eigenvector is only defined up to a scalar multiple: if \bm{x} is an eigenvector of \mathbf{A}, then so is c\,\bm{x} for any nonzero scalar constant c. For the convenience of users, functions that return eigenvectors should always return unit length vectors. Even so, two different functions might return real eigenvectors that are multiplied by -1 of each other, which is easy to detect. Complex eigenvectors could be multiplied by any unit length number on the complex plane, e^{i\,\theta} with \theta \in [0, 2\,\pi), so two correct sets of eigenvectors can appear quite different. Rather than comparing eigenvectors directly, enter the command norm(A*X - X*L, 'fro'), which returns a number close to zero if the eigenvalues and eigenvectors are correct.
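This phase ambiguity can be demonstrated with a small rotation matrix, whose eigenvalues are \pm i (an illustrative NumPy sketch; the angle \theta is arbitrary):

```python
import numpy as np

# Two correct complex eigenvector sets can differ by a unit-modulus
# factor e^{i theta} per column yet both satisfy A X = X L.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])        # eigenvalues are +i and -i
w, X = np.linalg.eig(A)
L = np.diag(w)

theta = 0.7
X2 = X * np.exp(1j * theta)        # rescale columns on the unit circle

print(np.linalg.norm(A @ X - X @ L))    # near zero
print(np.linalg.norm(A @ X2 - X2 @ L))  # also near zero
print(np.allclose(X, X2))               # False: the sets look different
```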