8.1. Introduction to Eigenvalues and Eigenvectors

Video Resource

In this lecture, Professor Gilbert Strang of MIT gives a good introduction to eigenvalues and eigenvectors.

The Best Kept Secret of Engineering

Whether you know it or not, you use eigenvalues and eigenvectors every day. Furthermore, knowing about them can greatly expand your ability to solve interesting and challenging problems. Engineers encounter eigenvalues and eigenvectors when studying mechanics and vibrations, and when working with big data. Google search uses eigenvectors to rank pages, and Netflix uses eigenvectors to predict your preference for a movie that you have not yet watched.

We use eigenvalues and eigenvectors to better understand and simplify expressions involving a matrix. Of course, a matrix is not equivalent to a single scalar value, but when a matrix multiplies one of its eigenvectors, the matrix may be replaced by a scalar (the corresponding eigenvalue). What an incredibly fantastic simplification! Many problems allow us to represent a matrix in terms of its eigenvalues and eigenvectors. Making this replacement can reduce the computational complexity of a problem and reveal valuable insights about the matrix and the problems we are attempting to solve.

Admittedly, you may need to think about this for a while and see some application examples before appreciating the full value of eigenvalues and eigenvectors. The important thing to remember is that eigenvalues and eigenvectors reveal and take advantage of important properties of matrices. Using a computer to find the eigenvalues and eigenvectors makes them easy to use and apply to various problems.

Matrix multiplication is like yoga to a vector — mostly stretching and rotation. That is, when we multiply a matrix and a vector together (\mathbf{A}\,\bm{x}), the result is usually a stretching and a rotation of the original vector \bm{x}. However, that is not always the case. There are special matrices that rotate a vector but do not change its length. (See Rotation of a Point for more on rotation matrices.) There are also vectors, corresponding to each square matrix, that are stretched by the multiplication but not rotated. This latter case is what we consider here.

Consider the following matrix and three vectors.

\mathbf{A} = \mat{6 1; 8 4}, \; \bm{x}_1 = \vector{1; 2}, \;
\bm{x}_2 = \vector{1; -4}, \; \bm{x}_3 = \vector{1; -1}

\mathbf{A}\,\bm{x}_1 = \vector{8; 16} = 8\,\bm{x}_1, \;
\mathbf{A}\,\bm{x}_2 = \vector{2; -8} = 2\,\bm{x}_2, \;
\mathbf{A}\,\bm{x}_3 = \vector{5; 4}

Fig. 8.1 shows the three vectors and the product of each vector with matrix \mathbf{A}. The product \mathbf{A}\,\bm{x}_3 shows the typical result of matrix-vector multiplication: the multiplication both stretches and rotates the vector. But in the cases of \mathbf{A}\,\bm{x}_1 and \mathbf{A}\,\bm{x}_2, the vectors are stretched but not rotated. This is because vectors \bm{x}_1 and \bm{x}_2 are eigenvectors of matrix \mathbf{A}. The eigenvalues tell us the stretching (scaling) factor for each eigenvector.

../_images/eigVmultiply.png

Fig. 8.1 Eigenvectors are only scaled by matrix multiplication. Other vectors are both scaled and rotated by multiplication.
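The three products above are easy to check numerically. As a minimal sketch (the text itself uses no particular software; Python with NumPy is just one convenient choice):

```python
import numpy as np

A = np.array([[6, 1],
              [8, 4]])
x1 = np.array([1, 2])
x2 = np.array([1, -4])
x3 = np.array([1, -1])

print(A @ x1)  # [ 8 16] = 8 * x1 -> x1 is stretched by 8, not rotated
print(A @ x2)  # [ 2 -8] = 2 * x2 -> x2 is stretched by 2, not rotated
print(A @ x3)  # [5 4], not a multiple of x3 -> stretched and rotated
```

The first two products are scalar multiples of the original vectors, while the third is not, which is exactly what Fig. 8.1 illustrates.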

When a vector \bm{x}_i is one of the eigenvectors of a matrix \mathbf{A}, the following special relationship holds.

\boxed{\mathbf{A}\,\bm{x}_i = \lambda_i\,\bm{x}_i}

For an n{\times}n square matrix, there are n eigenvalues, \lambda_i (counted with multiplicity), each paired with a corresponding eigenvector, \bm{x}_i.

The power of this relationship is that when, and only when, a matrix multiplies one of its eigenvectors, we can replace the matrix in the equation with a simple scalar value. This simplification provides elegant solutions to several otherwise complex problems.
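In practice, we seldom find eigenpairs by hand; a computer does it for us. As a brief sketch using NumPy's `np.linalg.eig` (the returned eigenvectors are normalized to unit length and their order is not guaranteed, so they may differ from \bm{x}_1 and \bm{x}_2 by a scalar factor):

```python
import numpy as np

A = np.array([[6, 1],
              [8, 4]])

# eig returns the eigenvalues and a matrix whose columns are unit eigenvectors
lams, X = np.linalg.eig(A)
print(np.sort(lams))  # eigenvalues 2 and 8, matching the example above

# Verify A x = lambda x for each eigenpair
for lam, x in zip(lams, X.T):
    print(np.allclose(A @ x, lam * x))  # True for every pair
```

Each column of `X` satisfies the boxed relationship \mathbf{A}\,\bm{x}_i = \lambda_i\,\bm{x}_i with its matching entry of `lams`.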

What's in a name?

In the German language, the word eigen means own, as in “my own opinion” or “my own family”. It is also used for the English word typical, as in “that is typical of him”. Perhaps the German word Eigenschaft, meaning feature, property, or characteristic, is clearer as to the intent. For a matrix, the eigenvectors and eigenvalues represent the vector directions and magnitudes distinctly embodied by the matrix.

The first known writing about eigenvectors and eigenvalues is by Leonhard Euler in the 18th century. Various mathematicians studied them and gave them different names over the next two centuries. German mathematician David Hilbert (1862–1943) is credited with naming them eigenvalues and eigenvectors in 1904.