
Eigenvectors and Eigenvalues

In previous chapters, we've talked a lot about how we can represent a linear transformation $T(x)$ as a matrix product $Ax$. With that in mind, we define an eigenvector as a nonzero vector whose direction doesn't change under the transformation. It may be scaled or negated, but it always stays on its own line.

An eigenvalue is the factor by which an eigenvector is scaled. For example, if the vector $\begin{bmatrix}2 \\ 3\end{bmatrix}$ becomes $\begin{bmatrix}3 \\ 2\end{bmatrix}$ after a transformation, it's not an eigenvector. However, if it becomes $\begin{bmatrix}-4 \\ -6\end{bmatrix}$, it is an eigenvector, and its eigenvalue is $-2$.
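As a quick sanity check, here is a small NumPy sketch of that idea. The matrix below is a made-up example (not one from this post) chosen so that it sends $(2, 3)$ to $(-4, -6)$; the helper simply tests whether a vector keeps its direction.

```python
import numpy as np

# A made-up matrix (not from the post) that happens to send [2, 3] to [-4, -6].
A = np.array([[-11.0, 6.0],
              [-18.0, 10.0]])

def direction_scale(A, x):
    """Return the scale factor if A @ x is a scalar multiple of x, else None.

    This simple ratio test assumes x has no zero entries.
    """
    y = A @ x
    ratios = y / x
    if np.allclose(ratios, ratios[0]):
        return ratios[0]
    return None

print(direction_scale(A, np.array([2.0, 3.0])))  # -2.0 -> eigenvector with eigenvalue -2
print(direction_scale(A, np.array([3.0, 2.0])))  # None -> direction changed, not an eigenvector
```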

Formal Definition

An eigenvector of an $n \times n$ matrix $A$ is a nonzero vector $x$ such that $Ax = \lambda x$ for some scalar $\lambda$. A scalar $\lambda$ is called an eigenvalue of $A$ only if there is a nontrivial ($x$ is nonzero) solution of that equation.

Note: $\vec{0}$ is never an eigenvector, because $A\vec{0} = \lambda\vec{0}$ holds for every matrix $A$ and every scalar $\lambda$.

Finding Eigenvectors and Eigenvalues

We're going to dedicate an entire post to calculating eigenvalues and eigenvectors; for now, it's important to know this:

$$Ax = \lambda x$$ $$Ax - \lambda x = 0$$ $$Ax - \lambda Ix = 0 \quad (\text{since } x = Ix)$$ $$(A - \lambda I)x = 0$$ So, for each eigenvalue $\lambda$, you find the null space of the matrix $A - \lambda I$, and you do this by row reducing the augmented matrix $[A - \lambda I \mid 0]$. This gives you a basis of the eigenspace: every nonzero vector in that eigenspace is an eigenvector with eigenvalue $\lambda$.
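As an illustration of that recipe, here is a short SymPy sketch with a made-up $2 \times 2$ matrix (the matrix and its eigenvalues are my own example, not values from the post): it finds the eigenvalues, then takes the null space of $A - \lambda I$ (which SymPy computes by row reduction) to get a basis of each eigenspace.

```python
from sympy import Matrix, eye

# A made-up 2x2 matrix for illustration.
A = Matrix([[4, 1],
            [2, 3]])

# Eigenvalues come from solving det(A - lambda*I) = 0.
print(A.eigenvals())                      # eigenvalues 5 and 2, each with multiplicity 1

# For each eigenvalue, the eigenspace is the null space of A - lambda*I.
for ev in A.eigenvals():
    basis = (A - ev * eye(2)).nullspace()
    print(ev, [list(v) for v in basis])   # e.g. 5 -> [1, 1],  2 -> [-1/2, 1]
```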

Theorem: Triangular Matrix

The eigenvalues of a triangular matrix are the entries on its main diagonal.

Proof

In order for the equation $(A - \lambda I)x = 0$ to have nontrivial solutions, the matrix $A - \lambda I$ must have at least one free variable. This is a theorem we prove in the "Rank" post. For a $3 \times 3$ upper triangular matrix, the resulting matrix is $\begin{bmatrix}a_{11} - \lambda & a_{12} & a_{13} \\ 0 & a_{22} - \lambda & a_{23} \\ 0 & 0 & a_{33} - \lambda\end{bmatrix}$. If there is at least one free variable, then there must be one row that isn't a pivot row, and for this triangular matrix that happens exactly when one of the diagonal entries is $0$. Therefore, there is a free variable whenever $\lambda$ equals one of $a_{11}$, $a_{22}$, or $a_{33}$.
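Here is a quick numerical check of the theorem, a sketch using NumPy and a made-up upper triangular matrix:

```python
import numpy as np

# A made-up upper triangular matrix; its diagonal is (4, -1, 3).
T = np.array([[4.0, 2.0, 7.0],
              [0.0, -1.0, 5.0],
              [0.0, 0.0, 3.0]])

# The computed eigenvalues match the diagonal entries (up to ordering / float error).
eigenvalues = np.linalg.eigvals(T)
print(np.sort(eigenvalues))                                     # approximately [-1.  3.  4.]
print(np.allclose(np.sort(eigenvalues), np.sort(np.diag(T))))   # True
```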

Theorem 2: Eigenvectors corresponding to distinct eigenvalues are linearly independent
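We won't prove this here, but here is a quick numerical illustration (not a proof), using a made-up matrix with three distinct eigenvalues: collecting its eigenvectors as the columns of a matrix gives full rank, so they are linearly independent.

```python
import numpy as np

# Made-up 3x3 matrix with three distinct eigenvalues (2, 3, and 5).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

w, V = np.linalg.eig(A)           # columns of V are eigenvectors
print(w)                          # three distinct eigenvalues: 2, 3, 5
print(np.linalg.matrix_rank(V))   # 3 -> the eigenvectors are linearly independent
```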

Application

This lets us write vectors as linear combinations of eigenvectors, which makes repeated matrix multiplication very easy to compute.

Let $v_1$ and $v_2$ be eigenvectors of $A$ with eigenvalues $\lambda_1 = 1$ and $\lambda_2 = 2$, and let $x = c_1 v_1 + c_2 v_2$. $$A^k x = A^k(c_1 v_1 + c_2 v_2)$$ $$ = c_1 A^k v_1 + c_2 A^k v_2 $$ $$ = c_1 \lambda_1^k v_1 + c_2 \lambda_2^k v_2$$ $$ = c_1 v_1 + c_2 \cdot 2^k v_2$$
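Here is a small NumPy sketch of this computation, using a made-up matrix whose eigenvalues are $1$ and $2$ (so it matches $\lambda_1 = 1$ and $\lambda_2 = 2$ above); the particular $c_1$, $c_2$, $v_1$, $v_2$ are my own choices for illustration.

```python
import numpy as np

# Made-up matrix with eigenvalues 1 and 2, matching lambda_1 = 1 and lambda_2 = 2 above.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])
v1 = np.array([1.0, 0.0])    # eigenvector for lambda_1 = 1
v2 = np.array([1.0, 1.0])    # eigenvector for lambda_2 = 2
c1, c2 = 3.0, 5.0
x = c1 * v1 + c2 * v2        # x written as a combination of eigenvectors

k = 10
lhs = np.linalg.matrix_power(A, k) @ x   # the expensive way: k matrix multiplications
rhs = c1 * v1 + c2 * (2 ** k) * v2       # the cheap way: scale each eigenvector
print(np.allclose(lhs, rhs))             # True
```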
