9. Singular Value Decomposition (SVD)

We now consider the matrix factorization that is perhaps the most important factorization in linear algebra. We introduced the SVD in The Singular Value Decomposition because we needed it for solving systems of linear equations, especially rectangular systems. In this chapter, we present a more complete picture of the SVD and its applications.

The importance of the SVD can be summarized with three properties.

  1. The SVD can factor any matrix, even singular and rectangular matrices.
  2. The SVD is used to solve many linear algebra problems and has applications in many fields of science and engineering, including artificial intelligence, data analytics, and image processing. The linear algebra applications of the SVD include computing low-rank matrix approximations, finding the rank of a matrix, finding the pseudo-inverse of a rectangular matrix, finding orthogonal basis vectors for the columns of a matrix, finding the condition number of a matrix, and finding the null space and left null space of a matrix.
  3. The SVD is now fast and numerically accurate. That was not always the case. A framework for improved algorithms for finding the SVD was introduced in 1965. A good implementation based on the Francis algorithm was available by 1970, and a small improvement to the algorithm was published in 1990.
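The first two properties can be seen concretely with a short computation. The book works in MATLAB, where the corresponding call is `svd(A)`; the sketch below uses NumPy's `np.linalg.svd` instead so the same steps can be run in Python. The 4-by-3 matrix is an arbitrary example, not one taken from the text.

```python
import numpy as np

# A rectangular (4x3) matrix -- the SVD factors it even though it is not square.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 1.0],
              [1.0, 1.0, 0.0],
              [4.0, 2.0, 2.0]])

# Full SVD: A = U @ Sigma @ Vt, where U (4x4) and Vt (3x3) are orthogonal
# and the singular values are returned sorted in descending order.
U, s, Vt = np.linalg.svd(A)

# Rebuild the 4x3 Sigma matrix from the vector of singular values.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

# The factorization reproduces A to machine precision.
print(np.allclose(A, U @ Sigma @ Vt))    # True

# The number of nonzero singular values is the rank of A.
rank = int(np.sum(s > 1e-10))
print(rank == np.linalg.matrix_rank(A))  # True
```

Keeping only the largest singular value (zeroing the rest of `Sigma`) gives the best rank-1 approximation of `A` in the least-squares sense, which is the low-rank approximation idea mentioned in property 2.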

This fast algorithm and the decomposition's vast application domain have made the SVD a cornerstone algorithm in software such as MATLAB. For several applications, it has replaced elimination-based algorithms in the internal workings of MATLAB and other software.

Note

Our discussion of the SVD focuses primarily on its interpretation and application. The material in Calculating the SVD with Unitary Transformations reviews the modern algorithm for finding the SVD, including a source code implementation. That material is very interesting, but it is not a required part of the course.

The history of the SVD

The official introduction of the SVD came in 1873, when Eugenio Beltrami published a paper describing it. Unaware of Beltrami’s paper, Camille Jordan independently published a paper describing it a year later. Jordan’s paper is more complete and accurate in the details presented. Beltrami and Jordan built on earlier work on the eigenvalues and eigenvectors of symmetric matrices, which are central to the classic SVD algorithm. Augustin-Louis Cauchy established the properties of eigenvalues and eigenvectors of symmetric matrices in 1827. Then in 1846, Carl Jacobi presented the diagonalization factorization of symmetric matrices. Stewart gives a nice summary of the early history of the SVD and related topics [STEWART93].

Cleve Moler, co-founder of MathWorks, observed in a blog post that the SVD was of little practical value until 1970 [MOLER06]. Even when computers became available, the algorithm was too slow to be useful. Then, between 1965 and 1970, Gene Golub, William Kahan, and Christian Reinsch developed an algorithm to find the SVD quickly [GOLUB70]. Their algorithm is not only faster than previous algorithms, but it also produces numerically more accurate results [YETURU20], [LAMBERS10].