.. _OrthFactoring:

Matrix Factorizations
=====================

.. index:: matrix factorization
.. index:: elimination procedures
.. index:: orthogonal matrix factoring

When solving a linear algebra problem, matrix factoring is usually the first
and most important step. Consider, for example, the available options for
solving a system of linearly independent equations of the form
:math:`\mathbf{A}\,\bm{x} = \bm{b}`. For pedagogical reasons, beginning
students might be instructed to use Gaussian elimination with an augmented
matrix. Computing the inverse of :math:`\mathbf{A}` is not a good option
because it is both slower and less numerically accurate than factoring.
However, if :math:`\mathbf{A}` is square, then LU decomposition followed by a
substitution algorithm works well. If :math:`\mathbf{A}` is over-determined,
then a numerically stable strategy is to use the orthogonal QR decomposition.
A substitution algorithm can quickly find a solution if the factoring
procedure yields a diagonal or triangular system. If a matrix factor is a
unitary matrix, its inverse is simply its conjugate (Hermitian) transpose.

When elimination procedures are used to factor a matrix, each step of the
procedure reduces a matrix element to zero, and the change is offset by an
entry in another matrix factor. Multiplication of the factors restores the
original matrix. For example, in LU decomposition
(:math:`\mathbf{A} = \mathbf{L\,U}`), row operations are applied to
:math:`\mathbf{A}` to produce the upper triangular :math:`\mathbf{U}`.
Correspondingly, entries are added to the lower triangular :math:`\mathbf{L}`
such that multiplying :math:`\mathbf{L}` and :math:`\mathbf{U}` reproduces
:math:`\mathbf{A}`. The objective is to produce triangular factors that can
be used with substitution to solve a system of equations
(:math:`\mathbf{A}\,\bm{x} = \mathbf{L\,U}\,\bm{x} = \bm{b}`).

Orthogonal matrix factoring algorithms use a different strategy.
Transformation matrices are designed so that matrix multiplication drives
selected elements to zero. For example, the QR decomposition
(:math:`\mathbf{A} = \mathbf{Q\,R}`) algorithm sequentially applies
Householder transformations, :math:`\mathbf{H}_i`, each of which zeros the
entries below the diagonal of the :math:`i^{\text{th}}` column. Matrix
:math:`\mathbf{R}` starts as :math:`\mathbf{A}` and moves closer to upper
triangular form with each column operation
(:math:`\mathbf{R}_i = \mathbf{H}_i\,\mathbf{R}_{i-1}`). The product of the
conjugate-transposed transformation matrices forms the :math:`\mathbf{Q}`
matrix, :math:`\mathbf{Q}_i = \mathbf{Q}_{i-1}\,\mathbf{H}_i^H`.
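
As a concrete illustration of the square case, the following is a minimal
sketch (assuming NumPy and SciPy are available; the random test matrix is
purely illustrative) of factoring once with LU and then solving
:math:`\mathbf{A}\,\bm{x} = \bm{b}` by substitution:

.. code-block:: python

    # Minimal sketch: LU-factor a square matrix, then solve A x = b.
    import numpy as np
    from scipy.linalg import lu, lu_factor, lu_solve

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    b = rng.standard_normal(4)

    # Explicit factors: P @ L @ U == A (P accounts for row pivoting).
    P, L, U = lu(A)
    print(np.allclose(P @ L @ U, A))       # True

    # lu_factor / lu_solve reuse one factorization for forward and
    # back substitution against any right-hand side.
    lu_piv = lu_factor(A)
    x = lu_solve(lu_piv, b)
    print(np.allclose(A @ x, b))           # True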
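
For the over-determined case, a similar sketch (same assumptions) uses the
reduced QR decomposition: with :math:`\mathbf{A} = \mathbf{Q\,R}`, the
least-squares solution follows from back substitution on
:math:`\mathbf{R}\,\bm{x} = \mathbf{Q}^H\,\bm{b}`:

.. code-block:: python

    # Minimal sketch: least-squares solution of an over-determined system
    # via QR decomposition and back substitution.
    import numpy as np
    from scipy.linalg import solve_triangular

    rng = np.random.default_rng(1)
    A = rng.standard_normal((8, 3))        # more equations than unknowns
    b = rng.standard_normal(8)

    Q, R = np.linalg.qr(A)                 # reduced QR: Q is 8x3, R is 3x3
    x = solve_triangular(R, Q.T @ b)       # back substitution on R x = Q^T b

    # Agrees with NumPy's least-squares solver.
    print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))   # True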
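
The sketch below follows the update rules
:math:`\mathbf{R}_i = \mathbf{H}_i\,\mathbf{R}_{i-1}` and
:math:`\mathbf{Q}_i = \mathbf{Q}_{i-1}\,\mathbf{H}_i^H` for real-valued
matrices. It is an assumption-laden illustration of the Householder
procedure, not a production implementation:

.. code-block:: python

    # Minimal sketch: QR factoring by Householder reflections.
    import numpy as np

    def householder_qr(A):
        m, n = A.shape
        R = A.astype(float)
        Q = np.eye(m)
        for i in range(min(m, n)):
            # Build the reflector that zeros R[i+1:, i] below the diagonal.
            x = R[i:, i]
            v = x.copy()
            v[0] += np.copysign(np.linalg.norm(x), x[0])
            if np.linalg.norm(v) == 0:
                continue                   # column already zeroed
            v /= np.linalg.norm(v)
            H = np.eye(m)
            H[i:, i:] -= 2.0 * np.outer(v, v)
            R = H @ R                      # R_i = H_i R_{i-1}
            Q = Q @ H.T                    # Q_i = Q_{i-1} H_i^T (H symmetric)
        return Q, R

    A = np.random.default_rng(2).standard_normal((5, 3))
    Q, R = householder_qr(A)
    print(np.allclose(Q @ R, A))           # factors reproduce A
    print(np.allclose(np.tril(R, -1), 0))  # R is upper triangular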