Okay, in the last two videos, 
we talked about what eigenvalues
and eigenvectors are.
Now we're gonna begin the process 
of figuring out how to compute
eigenvalues and eigenvectors.
This all has to do with something called
the characteristic polynomial of a matrix.
It only helps us find the eigenvalues.
We'll worry about the eigenvectors later.
The object of the game is that somebody
hands us an n by n matrix, and for our purposes,
let's just think in terms of the matrix
2 1 1 2. It's a nice, simple example.
We want to find the eigenvalues and 
we'll worry about the eigenvectors later.
We want to have a test to see whether
a number is an eigenvalue.
Once we've got a test, we can figure out
what are the numbers that pass the test
and then we'll have our eigenvalues.
So let's walk through this.
If a number lambda is an eigenvalue, that
means that we can find an eigenvector.
That is to say, we can find a nonzero
vector x such that Ax is lambda(x).
Another way to write lambda(x) is 
lambda times the identity matrix times x.
Because of course, the identity matrix 
times x is just x.
If we can do that, then we can move
everything to one side of the equation
and write lambda times the identity
minus A, times x,
is 0 for some non-zero x.
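Here's a quick numerical check of that rewrite (a NumPy sketch; the eigenpair lambda = 3 with x = (1, 1) is borrowed from the example we work out later in this video):

```python
import numpy as np

# The 2x2 example matrix from the video.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Peeking ahead: lambda = 3 with x = (1, 1) will turn out to be
# an eigenpair of A. Here we just use it to check the algebra.
lam = 3.0
x = np.array([1.0, 1.0])
I = np.eye(2)

# A x equals lambda x ...
print(A @ x)               # same vector as lam * x
# ... which is the same as saying (lambda I - A) x = 0.
print((lam * I - A) @ x)   # the zero vector
```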
Except for most matrices, the only
solution to that matrix times x equals 0
is the trivial solution.
If you have a nontrivial solution, 
that means that there's something
special about this matrix.
It means that when you row reduce it, 
you don't get the identity.
You get some other matrix.
You've got fewer than n pivots.
The columns are linearly dependent, so
you can find some linear combination
of them that equals 0.
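You can see that pivot count drop numerically (a NumPy sketch; the value 3 is the eigenvalue we compute below, and 5 is just an arbitrary non-eigenvalue):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
I = np.eye(2)

# At an eigenvalue (3, as we'll compute below), lambda*I - A
# is singular: its rank drops below n = 2.
print(np.linalg.matrix_rank(3.0 * I - A))   # fewer than n pivots

# At a number that is not an eigenvalue, it keeps full rank,
# so it row reduces to the identity.
print(np.linalg.matrix_rank(5.0 * I - A))
```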
How can we test whether
a matrix is singular?
Well, we have this magic oracle
called the determinant:
if the determinant is 0,
then the matrix is singular.
If the determinant is non-zero, 
then the matrix is non-singular.
In other words, lambda is an eigenvalue
if and only if
this funny determinant is 0.
We give this funny determinant a name.
We call it the characteristic
polynomial of A.
It depends on lambda of course.
It's a polynomial in the variable lambda,
and it depends on what the matrix A is.
Different matrices will have 
different polynomials.
This is the characteristic 
polynomial of A.
The eigenvalues of A are exactly those 
values of lambda that make the
characteristic polynomial equal to 0.
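Here's the whole test in symbolic form (a sketch using SymPy on our running 2 by 2 example; jumping ahead, the roots it prints are the ones we'll find by hand in a moment):

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 1],
               [1, 2]])
I = sp.eye(2)

# The characteristic polynomial is det(lambda*I - A).
p = sp.expand((lam * I - A).det())
print(p)   # a polynomial in the variable lambda

# The eigenvalues are exactly the values of lambda
# that make it equal to 0.
print(sp.solve(sp.Eq(p, 0), lam))
```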
The polynomial itself: when you compute
this n by n determinant, you always
get a lambda^n term from the diagonal.
So it's an nth order polynomial, and then
you get lower-order terms.
It's not exactly lambda^n. It's lambda^n
plus something times lambda^(n-1),
all the way down.
It's an nth order polynomial, and you know
that an nth order polynomial has at most
n roots.
That means that an n by n matrix can have
at most n eigenvalues.
If I give you a 2 by 2 matrix and 
you found 2 eigenvalues, you're done.
It can't possibly have any more than that.
Let's work an example.
Our favorite matrix, the matrix 2 1 1 2,
and then the identity matrix is 1 0 0 1.
You multiply that by lambda.
You get lambda 0 0 lambda.
Now we want to look at the matrix,
(lambda * the identity) - A.
You can just put lambdas on the diagonal
and then you copy all of the entries of A,
only with a minus sign.
Then the characteristic polynomial is
the determinant of that matrix.
You know how to take the 
determinant of a 2 by 2 matrix.
It's the product of the diagonal entries
minus the product of the
off-diagonal entries.
It gives you (lambda - 2)^2 - 1.
You multiply it all out and you get
(lambda^2) - 4(lambda) + 3.
The characteristic polynomial of this
matrix is (lambda^2)-4(lambda)+3.
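To guard against algebra slips, you can spot-check that expansion numerically (a NumPy sketch: the determinant and the polynomial should agree at any lambda we plug in):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
I = np.eye(2)

# det(lambda*I - A) should match lambda^2 - 4*lambda + 3
# for every value of lambda.
for lam in [0.0, 1.0, 2.5, 10.0]:
    det = np.linalg.det(lam * I - A)
    poly = lam**2 - 4 * lam + 3
    print(lam, det, poly)
```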
Okay. You got the 
characteristic polynomial.
Next thing is to find the eigenvalues.
Well those are just the roots 
of this quadratic polynomial.
You can use the quadratic formula.
You plug it in:
(4 plus or minus the square root
of 16 - 12) over 2, which is
(4 + or - 2)/2, and you get 
the roots are 1 and 3.
That means that the eigenvalues
of the matrix are 1 and 3.
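As a final sanity check, a library will report the same eigenvalues (a NumPy sketch; NumPy uses its own algorithm rather than root-finding on the characteristic polynomial, but it must agree):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Sorted so the order is predictable: should be 1 and 3.
print(np.sort(np.linalg.eigvals(A)))
```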
Now again, that doesn't tell you
the eigenvectors.
It just tells you what the
eigenvalues are.
We'll tackle the eigenvectors
in the next video.
