In the last video we talked about
how to find the Eigenvalues of a matrix.
You find the characteristic polynomials,
you find the roots of the characteristic
polynomial
and boom you've got the Eigenvalues.
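As a quick numerical sketch of that recipe (the matrix here is the one used as the running example later in this video), numpy can find the roots of the characteristic polynomial for us:

```python
import numpy as np

# The 2x2 matrix that serves as the running example in this video.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For a 2x2 matrix the characteristic polynomial is
# lambda^2 - trace(A)*lambda + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]

# The Eigenvalues are the roots of the characteristic polynomial.
eigenvalues = np.sort(np.roots(coeffs))
print(eigenvalues)
```

Keep in mind that `np.roots` works numerically, so you can get round-off in general; also, the trace/determinant formula for the coefficients is special to the 2x2 case.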
But what about the Eigenvectors?
So, the object of the game here
is, first of all,
to remember the definition of
the Eigenspace.
The Eigenspace is the set of
all solutions to
Ax=λx
and this is almost the same thing
as the set of all Eigenvectors
with Eigenvalue λ.
The one exception is that zero is in
the Eigenspace
and Eigenvectors are supposed to 
be non-zero.
But aside from that one exception
this is basically the set of all
Eigenvectors with that particular
Eigenvalue.
And so, we need to figure out
what this Eigenspace looks like.
And I claim that an Eigenspace
is actually a null space.
It's a null space of the matrix
A-λ times the identity.
Or if you prefer, it's the null
space of λ times the identity minus A.
This matrix and this matrix 
have the exact same null space
because they're just negatives
of each other.
Solving this times x=0 is 
the same thing as solving this times x=0.
Some people prefer to work with
A-λ times the identity.
Some people prefer to work with
λ times the identity minus A.
You get the same answers
either way.
So, to find a basis
for E_λ,
well you have to find a basis 
for a null space.
And how do you find a basis 
for a null space?
You row reduce.
So you have to take A-λ times
the identity,
or λ times the identity minus A 
if you prefer,
and you row reduce it
and then you find all of the solutions
to (A-λ times the identity)x=0
and the claim is that that's
the same thing as the Eigenspace.
So let's see why.
Suppose we have something
that's in the Eigenspace.
That means that Ax=λx 
and of course λx is the same
thing as λ times the identity 
matrix times x.
But that means that A-λ times 
the identity times x=0
or if you prefer you can say 
λ times the identity minus A x=0,
and by definition that means that
x is in the null space
of A-λ times the identity,
or its negative.
So there you have it.
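Here's a quick numeric sanity check of that equivalence. The matrix and Eigenpair below are just an illustration (they happen to be the example coming up next):

```python
import numpy as np

# Sanity check: Ax = lambda*x exactly when (A - lambda*I)x = 0.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0
x = np.array([1.0, 1.0])               # an Eigenvector for lambda = 3

Ax = A @ x                             # left-hand side of Ax = lambda*x
lx = lam * x                           # right-hand side
residual = (A - lam * np.eye(2)) @ x   # (A - lambda*I)x, should be zero
print(Ax, lx, residual)
```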
Let's work an example:
our favorite example,
which is the matrix 2 1 1 2.
Now in the last video we computed
the characteristic polynomial
of this matrix and discovered it 
was λ^2 -4λ +3.
And then we found the roots 
of that characteristic polynomial
and they were one and three.
And so now, that tells us the Eigenvalues
of A are one and three.
And we need to figure out what 
the Eigenvectors are.
And we do it one Eigenvalue at a time,
you can't do the whole thing at once.
Each different Eigenvalue 
requires a separate calculation.
So we start with λ=1.
When λ=1, A-λ times the identity 
is just -- well that's A
and that's 1 times the identity.
You subtract it off and you get this
two by two matrix
and you see this is a singular matrix.
When you row reduce it, 
you get something with only one pivot.
Well, now we solve this times x=0.
We only have one equation
and that equation is x_1+x_2=0.
And we rewrite that as an equation
for the pivot variable in terms
of the free variable.
And then we pad it with
x_2=x_2
and put these two things together
and it says
the whole vector x, x_1 x_2,
is a certain constant,
namely x_2, times -1 1.
So the basis for this Eigenspace
is just the single vector -1 1.
Now we could've used twice 
that vector,
or three times that vector.
Or minus that vector.
Sometimes it's more convenient
to work with 1 -1 instead of -1 1.
But it's the same direction.
It's the axis that's running 
Northwest to Southeast.
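To double-check that row reduction, here's a sketch using sympy, whose exact arithmetic does the row reduction and reads off the null-space basis for us:

```python
from sympy import Matrix

A = Matrix([[2, 1],
            [1, 2]])

# Row-reduce A - 1*I and read off a basis for its null space.
E1 = (A - 1 * Matrix.eye(2)).nullspace()
print(E1)
```

sympy returns one basis vector, and it is exactly the (-1, 1) we found by hand.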
That's one Eigenvalue.
We still have to look at the
other Eigenvalue.
When λ is 3,
A-3 times the identity is
A minus 3 times the identity,
and that gives us the matrix -1 1 1 -1.
So this is the matrix that 
we have to row reduce
and it row reduces to 1 -1 0 0.
Then we write down our equations:
there is only one equation,
x_1-x_2=0.
So x_1 is x_2.
Of course x_2 equals itself
and that means the vector x
has to be some multiple of 1 1.
So our basis for the Eigenspace E_3
is 1 1.
So a picture of the two Eigenspaces
looks like this.
In the 1 1 direction is the 
Eigenspace E_3.
In the 1 -1 direction is the 
Eigenspace E_1
and all other directions give you
vectors that aren't Eigenvectors.
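As a cross-check of the whole picture, numpy's built-in eigen-solver should agree with both hand computations. Keep in mind that `eig` scales its Eigenvectors to unit length, so only the directions should match:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, vecs = np.linalg.eig(A)

order = np.argsort(vals)   # sort the Eigenvalues: 1 first, then 3
vals = vals[order]
vecs = vecs[:, order]

# Each column of vecs should be parallel to a basis vector we found.
v1 = vecs[:, 0]            # should be a multiple of (-1, 1)
v3 = vecs[:, 1]            # should be a multiple of (1, 1)
print(vals, v1, v3)
```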
So that's the full story for matrices.
What if we wanted to find
the Eigenvalues of a linear operator?
If you have a linear operator,
well, you always try to make things
look like matrices by picking 
a basis.
And I'm not talking about a basis
of Eigenvectors,
I'm just talking about any old basis
for the vector space V.
So you pick a basis,
and we know that if L(x)
is a multiple of x
you just take the coordinates of
both sides
and you get to the matrix of L
times the coordinates of x
equals λ times the coordinates of x.
And that means that the coordinate vector
of x is an Eigenvector of the matrix of L
with Eigenvalue λ.
In other words, the Eigenvalues
of L are exactly the Eigenvalues of
the matrix of L.
So, the object of the game is you
pick a basis
and it doesn't even matter
which basis you use.
Different bases will give you
different matrices
but those matrices will all
have the same Eigenvalues.
As for the characteristic
polynomial of L,
we'll define that to be just the
characteristic polynomial
of this matrix, and no matter what
basis you use,
you get the same
characteristic polynomial.
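A small numeric illustration of that claim (the change-of-basis matrix P here is just something I made up): conjugating A by any invertible P gives a different matrix with the same Eigenvalues.

```python
import numpy as np

# Changing basis replaces the matrix of L by P^{-1} A P for an
# invertible change-of-basis matrix P (this P is an arbitrary example).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])

B = np.linalg.inv(P) @ A @ P   # the matrix of L in the new basis

# Different matrices, but the same Eigenvalues.
eigs_A = np.sort(np.linalg.eigvals(A))
eigs_B = np.sort(np.linalg.eigvals(B))
print(eigs_A, eigs_B)
```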
So that's how you find the
Eigenvalues of L:
you find the Eigenvalues of
this matrix.
And how do you find the Eigenvectors?
You find the Eigenvectors 
of this matrix
and those are the coordinates 
of the Eigenvectors of L.
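Here's a sketch of that whole procedure on a toy operator of my own choosing: L(p)(x) = p(2x+1) on polynomials of degree at most one, with the basis {1, x}.

```python
import sympy as sp

x = sp.symbols('x')

# A toy operator (my own example) on V = polynomials of degree <= 1:
# L(p)(x) = p(2x + 1), written in the basis {1, x}.
L = lambda p: sp.expand(p.subs(x, 2*x + 1))

# Matrix of L in that basis: L(1) = 1 gives column (1, 0);
# L(x) = 1 + 2x gives column (1, 2).
M = sp.Matrix([[1, 1],
               [0, 2]])

# Eigenvectors of the matrix are the coordinate vectors
# of Eigenvectors of L. Convert the coordinates back:
eigenpolys = {}
for lam, _, vecs in M.eigenvects():
    a, b = vecs[0]
    eigenpolys[lam] = a * 1 + b * x   # coordinates (a, b) -> a + b*x

print(eigenpolys)

# Check one of them back in the original space:
# L(1 + x) = 1 + (2x + 1) = 2(1 + x), so 1 + x is an Eigenvector of L.
assert sp.expand(L(eigenpolys[2]) - 2 * eigenpolys[2]) == 0
```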
