9.7. Other Applications of the SVD

Here we list two additional applications of the SVD to linear algebra: vector projections with the economy SVD and the polar decomposition.

9.7.1. Projection and the Economy SVD

The SVD has applications to vector projections, as described in Over-determined Systems and Vector Projections. We begin with an observation about the extra rows of zeros in the \(\bf{\Sigma}\) matrix for over-determined systems. As discussed in SVD of Rectangular Matrices, some columns of the \(\bf{U}\) matrix do not contribute to the final SVD product because they are multiplied by zeros in \(\bf{\Sigma}\).

Over-determined Full SVD
\[\begin{split}\mathbf{A} = \begin{bmatrix} \mathbf{\tilde{U}} & \mathbf{U}_{unused} \end{bmatrix} \begin{bmatrix} \mathbf{\tilde{\Sigma}} \\ \mathbf{0} \end{bmatrix} \mathbf{V}^T\end{split}\]
Economy SVD

The economy SVD removes the unused columns of \(\bf{U}\) and the rows of zeros in \(\bf{\Sigma}\).

\[\mathbf{A} = \mathbf{\tilde{U}} \mathbf{\tilde{\Sigma}} \mathbf{V}^T\]

The economy SVD is a valid factoring. The only noticeable difference in application is that \(\bf{\tilde{U}}\) is no longer square and therefore not unitary: its columns are still orthonormal, so \(\mathbf{\tilde{U}}^T \mathbf{\tilde{U}} = \bf{I}\), but \(\mathbf{\tilde{U}} \mathbf{\tilde{U}}^T \ne \bf{I}\). The pseudo-inverse from the economy SVD can be used to solve over-determined systems of equations (\(\mathbf{A}\,\bm{x} = \bm{b}\)) and to compute projection approximations.

\[\bm{\hat{x}} = \mathbf{A}^{+} \bm{b} = \mathbf{V}\,\mathbf{\tilde{\Sigma}}^{+} \mathbf{\tilde{U}}^T \bm{b}\]
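As a quick check (using a small, assumed data set separate from the four_projections script below), the following sketch verifies that \(\bf{\tilde{U}}\) from the economy SVD has orthonormal columns but is not unitary, and that the pseudo-inverse solution matches MATLAB's backslash least-squares solution.

% Sketch: economy SVD of a small, assumed over-determined system.
A = [ones(50,1) (1:50)'];          % 50-by-2 design matrix (example data)
b = 3 + 2*(1:50)' + randn(50,1);   % noisy right-hand side (example data)
[Ut, St, V] = svd(A, 'econ');      % Ut is 50-by-2, St is 2-by-2
disp(norm(Ut'*Ut - eye(2)))        % near zero: U~'*U~ = I
disp(norm(Ut*Ut' - eye(50)))       % not near zero: U~*U~' ~= I
x_hat = V*(St\(Ut'*b));            % x-hat = V * Sigma~^+ * U~' * b
disp(norm(x_hat - A\b))            % near zero: matches least squares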

Two pairs of matrices in the projection equation reduce to identity matrices: \(\mathbf{V}^T \mathbf{V} = \bf{I}\) and \(\mathbf{\tilde{\Sigma}}\,\mathbf{\tilde{\Sigma}}^{+} = \bf{I}\).

\[\begin{split}\begin{array}{rl} \bm{p} &= \mathbf{A}\,\bm{\hat{x}} \\ &= \mathbf{\tilde{U}\,\tilde{\Sigma}\,V}^T \mathbf{V\,\tilde{\Sigma}}^{+} \mathbf{\tilde{U}}^T \bm{b} \\ &= \mathbf{\tilde{U}\,\tilde{U}}^T \bm{b} \end{array}\end{split}\]

As mentioned in Alternate Projection Equation, orthonormal basis vectors of the \(\bf{A}\) matrix are needed for the projection. The modified Gram–Schmidt algorithm, QR factorization, or \(\bf{\tilde{U}}\) from the economy SVD may be used. The MATLAB function orth uses the economy SVD method to compute orthonormal basis vectors.

The four_projections script shows four ways to compute the projection for an over-determined system. The plot of the projections is shown in Four equivalent vector projection alternatives. The projection lines appear as one line because they lie on top of each other.

% File: four_projections.m
% Comparison of 4 ways to compute vector projections of an
% over-determined system.
%
%% Over-determined System with noise
t = linspace(0,20);
y = 10 - 0.75.*t + 5*randn(1,100);
b = y'; scatter(t,b)
A = ones(100,2); % A is the design matrix
A(:,2) = t';

%% basic pseudo-inverse projection onto the column space of A
x_hat = (A'*A)\(A'*b);
p1 = A*x_hat;

%% Alternate Gram-Schmidt
G = mod_gram_schmidt(A);
p2 = G*G'*b;

%% Econ SVD projection
[U, ~, ~] = svd(A, 'econ');
p3 = U*U'*b;

%% MATLAB's Orth function
O = orth(A);
p4 = O*O'*b;

%% Plot
figure, hold on, scatter(t, b)
plot(t, p1), plot(t, p2), plot(t, p3), plot(t, p4)
hold off
legend('Noisy Data', 'Onto Column Space', 'Gram-Schmidt', ...
    'SVD', 'Orth function')

Fig. 9.9 Four equivalent vector projection alternatives. The projection lines appear as one line because they are on top of each other.

9.7.2. Polar Decomposition

There is another factoring of a matrix that uses the submatrices of the SVD. The polar decomposition splits a matrix into symmetric and orthogonal factors. The factoring is \(\mathbf{A} = \mathbf{R\,Q}\), which is intended as a generalization to matrices of the polar representation of vectors in the complex plane, \(\bm{z} = r\,e^{i\,\theta}\), where \(r\) is the scalar length of \(\bm{z}\) and \(e^{i\,\theta}\) gives the direction of the vector according to Euler’s complex exponential formula. In the polar decomposition, \(\bf{Q}\) is a unitary rotation matrix, and \(\bf{R}\) has the same \(\norm{\cdot}_2\) matrix norm as \(\bf{A}\). Unlike the scalar \(r\), however, multiplying a vector by \(\bf{R}\) both scales and rotates the vector. The factoring is found by inserting an identity matrix, in the form \(\mathbf{U}^T \mathbf{U}\), into the SVD equation.

\[\begin{split}\begin{array}{rl} \mathbf{A} &= \mathbf{U\,\Sigma}\, \left(\mathbf{U}^T \mathbf{U}\right)\, \mathbf{V}^T \\ &= \left(\mathbf{U\,\Sigma\, U}^T\right) \left(\mathbf{U\, V}^T\right)\\ &= \mathbf{R\, Q} \\ & \hfill \\ \mathbf{R} &= \mathbf{U\,\Sigma\, U}^T \\ & \hfill \\ \mathbf{Q} &= \mathbf{U\, V}^T \end{array}\end{split}\]
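The following sketch (with an assumed \(2 \times 2\) example matrix) computes the polar decomposition from the SVD and verifies the properties described above: \(\mathbf{A} = \mathbf{R\,Q}\), \(\bf{Q}\) is orthogonal, \(\bf{R}\) is symmetric, and \(\bf{R}\) has the same matrix norm as \(\bf{A}\).

% Sketch: polar decomposition of an assumed 2-by-2 example matrix.
A = [4 1; 2 3];
[U, S, V] = svd(A);
R = U*S*U';                    % symmetric factor
Q = U*V';                      % orthogonal (rotation) factor
disp(norm(A - R*Q))            % near zero: A = R*Q
disp(norm(Q'*Q - eye(2)))      % near zero: Q is orthogonal
disp(norm(R - R'))             % near zero: R is symmetric
disp(norm(R,2) - norm(A,2))    % near zero: same matrix 2-norm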

The polar decomposition has applications to computer graphics and materials engineering where it is used to decompose stress tensors [BUFFINGTON14].