Math 208 Interactive Notebooks © 2024 by Soham Bhosale, Sara Billey, Herman Chau, Zihan Chen, Isaac Hartin Pasco, Jennifer Huang, Snigdha Mahankali, Clare Minerath, and Anna Willis is licensed under CC BY-ND 4.0
Chapter 6: Eigenstuff (Eigenvectors, Eigenvalues, Eigenbases, Eigenetc.)
by Anna Willis, for Sara Billey's 2024 WXML group. This follows Chapter 6 of Jeffrey Holt's Linear Algebra with Applications.
This part of the tutorial deals solely with eigen-related Sage functionality, and assumes you are already aware of basic Sage syntax. So if you aren't, then shoo! Off to the previous chapters you go!
If you are confident you know everything about characteristic polynomials, eigenvalues, eigenvectors, eigenbases, and diagonalization, and you just want to see how you can use Sage to calculate the stuff for you, scroll all the way to the final section. If you aren't, though, the rest of this chapter of the tutorial walks through the mathematical process of calculating these things, with some help from SageMath. I have also provided some example questions and solutions to help you on your eigen-journey.
Don't code anything you don't understand how to do yourself. You'll regret it when the exams come around! Trust me.
Definition of Eigenvectors and Eigenvalues
Let A be a square* matrix. A vector** v is an eigenvector of A if and only if there exists some scalar λ such that

Av = λv

Likewise, a scalar λ is an eigenvalue of A if and only if there exists some vector** v for which the above holds.
A useful property that follows from this definition is that all scalar multiples of an eigenvector of A are also eigenvectors of A, with the same eigenvalue λ. Try to prove this for yourself!
*This is important to remember. Non-square matrices do not have eigenvectors/values. An m x n matrix with m ≠ n takes input vectors of dimension n to output vectors of dimension m, so an output vector can never be a scalar multiple of its input vector - and thus there can be no eigenvectors. 😦
**Note that we do not consider the zero vector to be an eigenvector.
Visual Example of Eigenvectors and Eigenvalues (with SageMath and pyplot)
In my opinion, it's easier to understand what an eigenvector is by getting a clear look at one than by reading a definition. So I'll provide an example with graphs of eigenvectors and eigenvalues, using Sage's built-in plot functionality, which is super easy and convenient to use.
Let's start with our beloved unit square. It has corners (0, 0), (0, 1), (1, 0), and (1, 1). I'll plot it here:
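One way to draw it is to plot the three nonzero corners as arrows from the origin (a sketch; the original cell's styling may differ):

```python
# Corner vectors of the unit square, drawn as arrows from the origin.
vectors = [vector([1, 0]), vector([1, 1]), vector([0, 1])]
colors = ['red', 'blue', 'magenta']
square = sum(plot(v, color=c) for v, c in zip(vectors, colors))
square.show(xmin=-1, xmax=2, ymin=-1, ymax=2, aspect_ratio=1)
```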
Now, take the linear transformation T: ℝ^2 → ℝ^2, defined by the matrix A.
I will now apply the linear transformation to the vectors in the unit square, and display the vectors after the transformation:
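Assuming, for illustration, that A is the shear matrix below (it fixes <0, 1> and has characteristic polynomial (x - 1)^2, consistent with everything that follows), the transformation can be applied and plotted like this:

```python
# Hypothetical shear matrix, chosen so that it fixes <0, 1> and has
# characteristic polynomial (x - 1)^2, matching the rest of this chapter.
A = matrix([[1, 0], [1, 1]])
vectors = [vector([1, 0]), vector([1, 1]), vector([0, 1])]
colors = ['red', 'blue', 'magenta']
transformed = [A * v for v in vectors]
image = sum(plot(w, color=c) for w, c in zip(transformed, colors))
image.show(xmin=-1, xmax=2, ymin=-1, ymax=3, aspect_ratio=1)
```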
Take a look at the vectors before and after applying the linear transformation. The pictures look pretty different. But you'll notice that the magenta vector, pointing to the top left corner of the unit square, remains the same. We can verify this numerically as well:
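With the assumed A from above:

```python
A = matrix([[1, 0], [1, 1]])   # hypothetical A, as above
v = vector([0, 1])
print(A * v)            # (0, 1) -- the same vector
print(A * v == 1 * v)   # True: v is an eigenvector with eigenvalue 1
```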
Because the magenta vector <0, 1> is the exact same (scaled by a factor of 1) following the transformation, it is an eigenvector of the matrix A with the eigenvalue λ = 1. If you came to this tutorial without knowing what eigenvectors are yet, hopefully now you know. And if you came to this tutorial knowing what eigenvectors are... well, I hope you still know.
Finding the Characteristic Polynomial and Eigenvalues
As we recall from the definition, a scalar λ is an eigenvalue of a square matrix A if and only if there exists some nonzero vector v such that

Av = λv
We can do some algebra on this definition to get it into a form more convenient for actually finding these eigenvalues λ.
Av = λv
Av - λv = 0
(A - λI)v = 0

Now, this vector v cannot be the zero vector. So, for (A - λI)v to produce the zero vector, the matrix A - λI must not be one-to-one, as A - λI applied to the zero vector is already guaranteed to be the zero vector. This means that det(A - λI) = 0.

So, λ is an eigenvalue of A if and only if det(A - λI) = 0.
This is very handy for finding eigenvalues - you simply compute det(A - λI) and find which λ make it 0. det(A - λI) is also known as the characteristic polynomial of A, which you can get SageMath to find with the charpoly() or fcp() (Factored Characteristic Polynomial) function. These return the characteristic polynomial of a matrix (fcp() in factored form). Let's take our matrix A from the last section:
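With the (assumed) shear matrix from the visual example:

```python
A = matrix([[1, 0], [1, 1]])   # hypothetical A, as above
A.charpoly()   # x^2 - 2*x + 1
```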
SageMath is a heathen and uses x instead of λ by default. You can provide your own variable name in quotes as an argument to charpoly() or fcp(), like this:
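```python
A = matrix([[1, 0], [1, 1]])   # hypothetical A, as above
A.charpoly('lam')   # lam^2 - 2*lam + 1
```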
Setting the characteristic polynomial to 0, we get an equation to find our eigenvalues λ. SageMath can even give you the factored characteristic polynomial, making it even easier to identify the eigenvalues λ.
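```python
A = matrix([[1, 0], [1, 1]])   # hypothetical A, as above
A.fcp('lam')   # (lam - 1)^2
```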
We can also use the SageMath solve() function to really spell it out for us (once again using x instead of λ, because lambda is a reserved word in Python):
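One way to do it (still with the assumed A) is to convert the characteristic polynomial into a symbolic expression first:

```python
A = matrix([[1, 0], [1, 1]])   # hypothetical A, as above
x = var('x')
p = SR(A.charpoly())   # convert to a symbolic expression so solve() can handle it
solve(p == 0, x)       # [x == 1]
```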
We expected to see an eigenvalue 1, as the magenta vector <0, 1> stayed exactly the same following multiplication with A. So, we have confirmed our result from the previous section.
Note that we do indeed list eigenvalues as many times as they appear in the characteristic polynomial. The root λ = 1 appears twice, so we would write the eigenvalues of A as λ = 1, 1.
While we just went through the process of finding the characteristic polynomial, setting it to 0, and solving for the eigenvalues λ, SageMath can just find the eigenvalues for you:
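```python
A = matrix([[1, 0], [1, 1]])   # hypothetical A, as above
A.eigenvalues()   # [1, 1]
```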
The eigenvalues() function returns all of A's eigenvalues in an unsorted list, including repeats. Simple enough.
Practice Problem 1: Characteristic Polynomials and Eigenvalues
Find the characteristic polynomial and eigenvalues of this matrix:

M =
[5 6]
[0 2]
Run the code below to check your answers, and look at the solution below for an explanation of the calculations.
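```python
M = matrix([[5, 6], [0, 2]])
print(M.charpoly('lam'))   # lam^2 - 7*lam + 10
print(M.fcp('lam'))        # (lam - 5) * (lam - 2), up to factor order
print(M.eigenvalues())     # [5, 2] (order may vary)
```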
SOLUTION
The characteristic polynomial of a matrix M is det(M - λI).

So, the characteristic polynomial of M is (5 - λ)(2 - λ) = λ^2 - 7λ + 10.

This is a nice characteristic polynomial. Set it to 0 to get your eigenvalues: λ = 5, 2.
Finding Eigenvectors
Once you have the eigenvalues λ, it's simple to find the corresponding eigenvectors. You just need to find the vectors v for your eigenvalue λ such that

(A - λI)v = 0
This entails solving a system of linear equations. We know how to do that.
Let's take our matrix A and its only eigenvalue λ = 1, and compute A - λI first:
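With the assumed A and λ = 1:

```python
A = matrix([[1, 0], [1, 1]])   # hypothetical A, as above
A - 1 * identity_matrix(2)
# [0 0]
# [1 0]
```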
Looking at the matrix A - I, we can see that vectors v that satisfy (A - I)v = 0 need only have a 0 as their first coordinate. So, the eigenvectors are of the form s<0, 1>, where s is a scalar. This again checks out with what we saw visually, with <0, 1> staying the same before and after the transformation, and all scalar multiples of eigenvectors being eigenvectors themselves.
But if this were a more complicated system, or if you're lazy, SageMath can help:
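```python
A = matrix([[1, 0], [1, 1]])   # hypothetical A, as above
(A - identity_matrix(2)).solve_right(vector([0, 0]))   # (0, 0)
```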
As you can see, though, you shouldn't let SageMath's solve_right() be the final word. While the vector <0, 0> does indeed satisfy this system - as it should, because any matrix multiplied by the zero vector will produce the zero vector - it's not the answer we're looking for. What you really want to ask SageMath for is the kernel of A - λI, or the set of vectors v that satisfy (A - λI)v = 0.
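```python
A = matrix([[1, 0], [1, 1]])   # hypothetical A, as above
(A - identity_matrix(2)).right_kernel()
# Free module of degree 2 and rank 1 over Integer Ring
# Echelon basis matrix:
# [0 1]
```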
And there you have it. The basis for A's eigenvectors with eigenvalue λ = 1 is {<0, 1>}.
Instead of going through all of these steps, you could just ask SageMath for the eigenvectors with the eigenvectors_right() function:
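```python
A = matrix([[1, 0], [1, 1]])   # hypothetical A, as above
A.eigenvectors_right()   # [(1, [(0, 1)], 2)]
```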
The output of the eigenvectors_right() and _left() functions is a list of triples. Each triple is of the form (eigenvalue λ, list containing an eigenvector for that eigenvalue λ, multiplicity). A's only eigenvalue is λ = 1, an eigenvector for λ = 1 is <0, 1>, and the eigenvalue λ = 1 appeared 2 times as a root of A's characteristic polynomial set to 0. So SageMath gives us [(1, [(0, 1)], 2)].
Practice Problem 2: Finding Eigenvectors
Find the eigenvectors of this matrix:

M =
[5 6]
[0 2]

Recall that you already found the eigenvalues of this matrix in the previous practice problem: λ = 5, 2.
Run the code below to check your answer, and look at the solution below for a more detailed explanation.
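```python
M = matrix([[5, 6], [0, 2]])
M.eigenvectors_right()
# [(5, [(1, 0)], 1), (2, [(1, -1/2)], 1)] -- order may vary;
# (1, -1/2) is a scalar multiple of <2, -1>
```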
SOLUTION
To find the eigenvectors of M corresponding to λ = 5, find the vectors v that satisfy this equation:

(M - 5I)v = 0, that is,

[0  6]
[0 -3] v = 0

It's clear to see that any vector with a second coordinate of 0 will satisfy this equation, and thus be an eigenvector with eigenvalue 5. A good example is <1, 0>.
To find the eigenvectors of M corresponding to λ = 2, find the vectors v that satisfy this equation:

(M - 2I)v = 0, that is,

[3 6]
[0 0] v = 0

So, any scalar multiple of the vector <2, -1> will be an eigenvector with eigenvalue 2.
Finding Eigenbases
As usual, I'll start off with a quick definition.
Let A be a square matrix of dimensions n x n. An eigenbasis is a basis of ℝ^n consisting of eigenvectors of A.
Eigenvectors with different eigenvalues are always linearly independent. For a pair of eigenvectors, this follows from the fact mentioned earlier that all scalar multiples of an eigenvector with eigenvalue λ are eigenvectors with that same eigenvalue λ: if one eigenvector were a scalar multiple of the other, the two would have to share an eigenvalue. So, if A has n distinct eigenvalues, A will also have an eigenbasis of ℝ^n.
Let's take the matrix A we have been using this whole time. We can't get an eigenbasis from A: its only distinct eigenvalue is λ = 1, and its eigenvectors are all scalar multiples of <0, 1>, which can't span ℝ^2.
So goodbye A, and onto our new matrix B:

B =
[3 2]
[1 4]
B is a 2x2 matrix. So to get an eigenbasis from B, B needs to have 2 distinct eigenvalues. Let's verify that it does:
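```python
# B as reconstructed above from this section's eigendata
# (eigenvalues 5 and 2, eigenvectors <1, 1> and <1, -1/2>).
B = matrix([[3, 2], [1, 4]])
B.eigenvalues()   # [5, 2] (order may vary)
```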
5 and 2 are indeed different numbers (bet you didn't know that), so we can get an eigenbasis from B. Let's get its eigenvectors:
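```python
B = matrix([[3, 2], [1, 4]])   # B as above
B.eigenvectors_right()   # [(5, [(1, 1)], 1), (2, [(1, -1/2)], 1)]
```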
The eigenvalue λ = 5 has eigenvectors of form s<1, 1>, and λ = 2 has eigenvectors of form t<1, -1/2>, where s and t are scalars. Let's get the eigenbasis matrix:
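A minimal sketch of such a helper (the original get_eigenbasis_matrix() mentioned below may differ in its details):

```python
def get_eigenbasis_matrix(M):
    # Gather the basis vectors of every eigenspace of M,
    # then return them as the columns of a single matrix.
    basis_vectors = []
    for eigenvalue, vectors, multiplicity in M.eigenvectors_right():
        basis_vectors.extend(vectors)
    return matrix(basis_vectors).transpose()

B = matrix([[3, 2], [1, 4]])   # B as above
get_eigenbasis_matrix(B)
# [   1    1]
# [   1 -1/2]
```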
There's another way to obtain the eigenbasis matrix with built-in Sage functionality that I'll go over in the next section, on diagonalization. I think it's important conceptually to think about the process of finding the eigenbasis, though, as outlined in my helper function get_eigenbasis_matrix().
Diagonalization
Firstly, a matrix is a diagonal matrix if all of its entries are 0 except for those along the diagonal. Here's an example of a diagonal matrix:
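For instance (entries chosen arbitrarily):

```python
# A diagonal matrix: nonzero entries may appear only on the main diagonal.
D = diagonal_matrix([4, -1, 7])
D
# [ 4  0  0]
# [ 0 -1  0]
# [ 0  0  7]
```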
A matrix M is diagonalizable if there exist some invertible matrix P and some diagonal matrix D such that this is true:

M = PDP^-1
You may be wondering why this is important or useful. Well, diagonalizing a matrix makes it much easier to raise to a power. Raise a diagonalizable matrix M to the third power, for example:

M^3 = (PDP^-1)(PDP^-1)(PDP^-1) = PD(P^-1P)D(P^-1P)DP^-1 = PD^3P^-1

More generally, all of the Ps and P-inverses in the middle cancel out, meaning that:

M^k = PD^kP^-1

And raising the diagonal matrix D to a power is easy - you just raise each entry on its diagonal to that power.
Now let's get into how to diagonalize a matrix M. This means identifying the matrix P and the diagonal matrix D.
P is an eigenbasis matrix of M. So if you can't get an eigenbasis from M, M is not diagonalizable.
We don't have to worry about P being invertible, because as we know, the eigenbasis matrix comprises M's linearly independent eigenvectors, and thus by the Unifying Theorem it is invertible.
For the matrix B we've been using:

P =
[   1    1]
[   1 -1/2]

The diagonal matrix D has the eigenvalues corresponding to the eigenvectors in P as the elements on the diagonal. The eigenvalue associated with B's eigenvector <1, 1> (first column of P) is λ = 5, so the first element in D's diagonal is 5. The eigenvalue associated with B's eigenvector <1, -1/2> (second column of P) is 2, so the second element in D's diagonal is 2.

D =
[5 0]
[0 2]
To verify that this is indeed the correct diagonalization of B, let's use SageMath to multiply out PDP^-1.
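With the reconstructed B, P, and D from above:

```python
B = matrix([[3, 2], [1, 4]])
P = matrix([[1, 1], [1, -1/2]])   # eigenvector columns <1, 1> and <1, -1/2>
D = diagonal_matrix([5, 2])       # matching eigenvalues on the diagonal
print(P * D * P.inverse())        # [3 2]
                                  # [1 4]
print(P * D * P.inverse() == B)   # True
```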
So, we've got it!
Of course, you can use SageMath to skip this whole process. But you should understand the process before you wave your hands around and have Sage do it for you.
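Sage's eigenmatrix_right() does exactly this:

```python
B = matrix([[3, 2], [1, 4]])   # B as above
B.eigenmatrix_right()
# (
# [5 0]  [   1    1]
# [0 2], [   1 -1/2]
# )
```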
This returns a tuple, in which each entry is a matrix. The first matrix (element 0 in the tuple) is D, and the second matrix (element 1 in the tuple) is P.
So, our own diagonalization is correct, at least according to Sage.
Note that there is another possible diagonalization for B, if you switch the order of the eigenvectors/values. SageMath only gives you one of the options, but this is totally fine too:
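```python
B = matrix([[3, 2], [1, 4]])       # B as above
P2 = matrix([[1, 1], [-1/2, 1]])   # eigenvector columns in the opposite order
D2 = diagonal_matrix([2, 5])       # eigenvalues reordered to match
P2 * D2 * P2.inverse() == B        # True
```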
Before you try and diagonalize a matrix, you can also check if it's diagonalizable with SageMath:
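The check is is_diagonalizable() (the matrices are defined over QQ here, since the check expects entries from a field):

```python
B = matrix(QQ, [[3, 2], [1, 4]])
A = matrix(QQ, [[1, 0], [1, 1]])   # hypothetical A, as above
print(B.is_diagonalizable())   # True
print(A.is_diagonalizable())   # False -- its single eigenspace is only one-dimensional
```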
Quick Review of Eigen-related Sage Functionality
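The functions this chapter covered, all in one place:

- charpoly() / fcp() - the characteristic polynomial (fcp() gives it factored); pass a variable name like 'lam' if the default x bothers you
- eigenvalues() - all eigenvalues, repeats included
- eigenvectors_right() / eigenvectors_left() - list of (eigenvalue, eigenvectors, multiplicity) triples
- right_kernel() - apply it to A - λI to get the eigenvectors for λ
- eigenmatrix_right() - the (D, P) pair of the diagonalization
- is_diagonalizable() - checks whether a matrix is diagonalizable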
That's all! Happy coding, linear algebra-ing, and eigen-ing!