Eigenvalues and Eigenvectors

Eigenvectors are a special type of vector associated with linear transformations. An eigenvector has the property that when the linear transformation is applied to it, the output is simply a scalar multiple of the input.

Put more formally, suppose that L is a linear mapping. If \vec{v} is a non-zero vector such that L(\vec{v}) = \lambda \vec{v} for some scalar \lambda, then \vec{v} is an eigenvector of the linear transformation L. Furthermore, we refer to \lambda as the eigenvalue corresponding to \vec{v}.

Example: Let A = \begin{bmatrix} 17 & -15 \\ 20 & -18 \end{bmatrix}, and let \vec{v} = \begin{bmatrix} 1 \\ 1 \end{bmatrix}. Show that \vec{v} is an eigenvector of A.

To show that \vec{v} is an eigenvector of A, we need to show that A \vec{v} = \lambda \vec{v} for some scalar \lambda. In this case, we can carry out the matrix multiplication directly and show that the result is a scalar multiple of \vec{v}.

\begin{bmatrix} 17 & -15 \\ 20 & -18 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ 2 \end{bmatrix} = 2 \begin{bmatrix} 1 \\ 1 \end{bmatrix}

As you can see, the result is the original vector scaled by a factor of 2; therefore, \vec{v} is an eigenvector of A with eigenvalue 2.
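We can also verify this computation numerically. Below is a minimal sketch using Python and NumPy (the choice of library is just an assumption for illustration); it multiplies the matrix from the example by \vec{v} and checks that the result is 2\vec{v}.

import numpy as np

# The matrix and candidate eigenvector from the example
A = np.array([[17, -15],
              [20, -18]])
v = np.array([1, 1])

# Applying A to v should give a scalar multiple of v
result = A @ v
print(result)                      # [2 2]
print(np.allclose(result, 2 * v))  # True, so v is an eigenvector with eigenvalue 2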

From this example, we can see that it is easy to verify whether a given vector is an eigenvector of some linear transformation. The next step is to derive a way to find eigenvectors for any linear transformation. To do this, notice that the defining equation A\vec{v} = \lambda \vec{v} can be rewritten as A\vec{v} - \lambda \vec{v} = \vec{0}. All we are saying here is that if we apply the transformation to \vec{v} and then subtract \lambda times the same \vec{v}, we get \vec{0}.

From here, we can factor out the \vec{v} term. Since \lambda is a scalar, we first rewrite \lambda \vec{v} as \lambda I \vec{v}, where I is the identity matrix, which gives (A - \lambda I) \vec{v} = \vec{0}. The idea is that if we can find some non-zero \vec{v} that satisfies this equation, then we have an eigenvector.

Theorem: Suppose that A is an n \times n matrix. A real number \lambda is an eigenvalue of A if and only if det(A - \lambda I) = 0. Furthermore, if \lambda is an eigenvalue of A, then the non-trivial solutions to (A - \lambda I) \vec{v} = \vec{0} are exactly the eigenvectors of A corresponding to \lambda.

This theorem will allow us to solve for the eigenvalues, which in turn will allow us to construct the corresponding eigenvectors.

Example: Find all the eigenvalues and eigenvectors of A = \begin{bmatrix} 17 & -15 \\ 20 & -18 \end{bmatrix}.

We start by solving the equation det(A - \lambda I) = 0.

To do this, let’s first construct A - \lambda I. In \mathbb{R}^2, the identity matrix I is \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, so \lambda I = \lambda \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} \lambda & 0 \\ 0 & \lambda \end{bmatrix}.

Now, we compute A - \lambda I:

\begin{bmatrix} 17 & -15 \\ 20 & -18 \end{bmatrix} - \begin{bmatrix} \lambda & 0 \\ 0 & \lambda \end{bmatrix} = \begin{bmatrix} 17 - \lambda & -15 \\ 20 & -18 - \lambda \end{bmatrix}

If we take the determinant of this matrix, we end up with a quadratic polynomial in \lambda:

(17 - \lambda)(-18 - \lambda) - (-15)(20)
= \lambda^2 + \lambda - 6
= (\lambda + 3)(\lambda - 2)

Setting this polynomial equal to zero gives us two eigenvalues to work with: \lambda = -3 and \lambda = 2.
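As a quick sanity check, the roots of this quadratic can also be found numerically. Here is a minimal sketch using NumPy (again, just one possible tool); the coefficients 1, 1, -6 come from \lambda^2 + \lambda - 6 above.

import numpy as np

# Coefficients of lambda^2 + lambda - 6, highest degree first
coefficients = [1, 1, -6]

# The roots of the characteristic polynomial are the eigenvalues
print(np.roots(coefficients))  # -3.0 and 2.0 (order may vary)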

To find the eigenvectors, we just need to solve (A - \lambda I) \vec{v} = \vec{0} for each eigenvalue. Notice that this is just a linear system of equations:

(17 - \lambda)x_1 - 15x_2 = 0
20x_1 - (18 + \lambda)x_2 = 0

You can solve this system any way you prefer; in this case, I’m going to use row reduction. I’ll start with the case where \lambda = -3. Substituting this value of \lambda gives us the system below.

20x_1 - 15x_2 = 0
20x_1 - 15x_2 = 0

This system has the following coefficient matrix, which row reduces as shown:

\begin{bmatrix} 20 & -15 \\ 20 & -15 \end{bmatrix} \sim \begin{bmatrix} 1 & -\frac{3}{4} \\ 0 & 0 \end{bmatrix}

This shows us that any scalar multiple of \begin{bmatrix} \frac{3}{4} \\ 1 \end{bmatrix} (equivalently, of \begin{bmatrix} 3 \\ 4 \end{bmatrix}) is an eigenvector of A corresponding to \lambda = -3. We can do the same steps with \lambda = 2.

15x_1 - 15x_2 = 0
20x_1 - 20x_2 = 0

This system has the following coefficient matrix, which row reduces as shown:

\begin{bmatrix} 15 & -15 \\ 20 & -20 \end{bmatrix} \sim \begin{bmatrix} 1 & -1 \\ 0 & 0 \end{bmatrix}

This shows us that any scalar multiple of \begin{bmatrix} 1 \\ 1 \end{bmatrix} is an eigenvector of A corresponding to \lambda = 2. Therefore, we now know every eigenvalue of A and all of its eigenvectors.
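As before, this can be double-checked numerically. The sketch below uses NumPy's np.linalg.eig; note that it returns unit-length eigenvectors, which are simply particular scalar multiples of the ones we found by hand.

import numpy as np

A = np.array([[17, -15],
              [20, -18]])

# eigenvalues is a 1-D array; the columns of eigenvectors are the
# corresponding eigenvectors, normalized to unit length
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # 2.0 and -3.0 (order may vary)

# Each eigenpair should satisfy A v = lambda v
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))  # True for both pairs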

Notice that in our example, the determinant used to find the eigenvalues was a quadratic polynomial. In general, we refer to the polynomial det(A - \lambda I) as the characteristic polynomial of A. By analyzing the characteristic polynomial, we can learn about the eigenvalues of A, such as how many of them exist and whether they are real or complex.
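For larger matrices, computing the characteristic polynomial by hand quickly becomes tedious, so it is worth knowing that a computer algebra system can produce it symbolically. Below is a minimal sketch using SymPy (an assumption made for illustration), applied to the same 2 x 2 matrix from the example.

from sympy import Matrix, symbols, solve, eye

lam = symbols('lambda')
A = Matrix([[17, -15],
            [20, -18]])

# Characteristic polynomial det(A - lambda*I)
p = (A - lam * eye(2)).det()
print(p.factor())     # factors as (lambda - 2)*(lambda + 3)

# Its roots are the eigenvalues
print(solve(p, lam))  # [-3, 2]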
