Now we're ready to start the heart of the
course, which is the study of
eigenvalues and eigenvectors. So as with
all mathematical concepts, there are three
questions you have to ask yourself when
you get a new idea:
What is it? How do you compute it?
And what's it good for?
Now this video is going to be all about
what is it, the definitions of eigenvalues
and eigenvectors. And then the next video
is going to be how do you compute it.
And what's it good for is going to be most
of the rest of the course,
although we saw a fair amount of that
already in chapter one,
the de-coupling principle.
Eigenvalues and eigenvectors are all about
de-coupling matrices and linear
transformations.
Okay, so there's one definition for
matrices and another for linear operators.
So let's do the matrices first.
So let's suppose we have a square matrix,
an n by n matrix, and we have a nonzero
vector. If it happens that the matrix
times a vector is a multiple of the
vector, that is, the matrix times
the vector is pointing the same direction
as the vector, then we call the vector
an eigenvector and we call the scalar
multiple, an eigenvalue.
So for example, let's start off with one
of the simplest matrices you could
write down, just a diagonal matrix
with one, zero, zero, two.
One zero is an eigenvector because if
you multiply one, zero, zero, two
by one zero, you get one zero.
So one zero is an eigenvector
with eigenvalue 1.
Zero one is an eigenvector with
eigenvalue two because if you multiply
A times zero one you get zero two.
On the other hand, one one isn't an
eigenvector at all.
It's not an eigenvector because if you
multiply the matrix by one one,
you get one two and that's not a multiple
of one one. In general, most vectors
are not going to be eigenvectors and most
numbers are not going to be eigenvalues.
It's only very special vectors and very
special numbers that are going to be
eigenvectors and eigenvalues.
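To make this concrete, here's a quick numerical check of the example above, sketched in Python with numpy (my own illustration, not something from the lecture):

```python
import numpy as np

# The diagonal matrix from the example: [[1, 0], [0, 2]]
A = np.array([[1.0, 0.0],
              [0.0, 2.0]])

# [1, 0] is an eigenvector with eigenvalue 1: A v = 1 * v
v1 = np.array([1.0, 0.0])
print(A @ v1)    # [1. 0.]

# [0, 1] is an eigenvector with eigenvalue 2: A v = 2 * v
v2 = np.array([0.0, 1.0])
print(A @ v2)    # [0. 2.]

# [1, 1] is not an eigenvector: A @ [1, 1] = [1, 2],
# which is not a scalar multiple of [1, 1]
v3 = np.array([1.0, 1.0])
print(A @ v3)    # [1. 2.]
```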
Okay, now if you have an eigenvalue
and an eigenvector, any multiple of that
eigenvector is still going to be an
eigenvector. Now we can ask,
what are all of the eigenvectors
associated with a particular eigenvalue?
So in other words, what are all the
solutions to Ax equals lambda x?
And we call that an eigenspace.
Now strictly speaking, there is one
solution to this equation that is not
an eigenvector. The zero vector is in
every eigenspace, but in our definition
of eigenvectors we had to say
a nonzero vector, because of course the
zero vector satisfies this equation
for every value of lambda,
and that's cheating.
We're interested in nonzero vectors to
be eigenvectors, but zero does count
as being in the eigenspace.
So let's look again at our matrix
one, zero, zero, two.
The eigenspace with eigenvalue one is
all multiples of one zero, so there it is
in red, the eigenspace E1.
The eigenspace with eigenvalue two
is all multiples of zero one, so there
is E2. So what this operator does,
what this matrix does, is it stretches
things by a factor of one
in the horizontal direction. It stretches
things by a factor of two
in the vertical direction. And if you 
take any vector that's not horizontal
or vertical, if you take that vector to be
x, then Ax will not be parallel to x.
And since it's not parallel, this is not
an eigenvector. The only eigenvectors
are in E1 or in E2. And again, remember
zero does not count as an eigenvector,
but it is in every eigenspace.
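The eigenspace picture can also be checked with numpy (again just an illustrative sketch): every scalar multiple of an eigenvector, including the zero multiple, satisfies Ax = lambda x, and np.linalg.eig recovers the two eigenvalues directly.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0]])

# Every multiple of [0, 1] lies in the eigenspace E2,
# including c = 0, which gives the zero vector
v = np.array([0.0, 1.0])
for c in [3.0, -1.5, 0.0]:
    assert np.allclose(A @ (c * v), 2.0 * (c * v))

# np.linalg.eig finds the eigenvalues and unit eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues))    # [1. 2.]
```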
Okay, that's it for matrices. Let's talk
about linear operators.
So let's suppose we have a vector space
V and a linear operator L that sends V
to itself, and we have a nonzero vector
v which has the property that when you
feed it to L, you get a multiple of what
you started from. Well, then v is called
an eigenvector and lambda's called an
eigenvalue. It's really the same
definition that we had for matrices,
it's just that instead of saying
A times x, we say L of v.
So for example, let's suppose that we had
the space of smooth complex valued
functions of a real variable,
and then L will be the derivative.
If you think about the function
e to the 3t, the derivative of
e to the 3t is 3 e to the 3t.
So this is an eigenvector of the
derivative operator with eigenvalue 3.
And if you look at the function
cosine(t) plus i sine(t),
that's an eigenvector with eigenvalue i.
Let's see why.
If you take its derivative, the derivative
of cosine is negative sine,
the derivative of i sine is
i times cosine, and what you get
is i times (cosine plus i sine). So just as
e to the 3t is an eigenvector
with eigenvalue 3, this is an eigenvector
with eigenvalue i.
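Both derivative facts can be sanity-checked numerically, for instance by comparing a finite-difference derivative in numpy against lambda times the function (a sketch added here for illustration):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1001)
h = t[1] - t[0]

# e^{3t}: its derivative should be 3 * e^{3t} (eigenvalue 3)
f = np.exp(3 * t)
df = np.gradient(f, h)          # centered finite differences
assert np.allclose(df[1:-1], 3 * f[1:-1], rtol=1e-3)

# cos(t) + i sin(t): its derivative should be
# i * (cos(t) + i sin(t))  (eigenvalue i)
g = np.cos(t) + 1j * np.sin(t)
dg = np.gradient(g, h)
assert np.allclose(dg[1:-1], 1j * g[1:-1], rtol=1e-3)
```

The endpoints are dropped from the comparison because np.gradient uses less accurate one-sided differences there.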
Okay, so those are the definitions.
Now this happens to be an infinite
dimensional vector space. Let's look at
a finite dimensional vector space.
Let's suppose that a vector space has a
basis with n elements in it,
so it's an n dimensional vector space,
and it should behave more or less
like Rn. Now the matrix of L is an
n by n matrix, and you can ask what are
the eigenvalues and eigenvectors of that
n by n matrix. And I claim the eigenvalues
of that matrix are exactly the eigenvalues
of the linear operator. And the
eigenvectors of that matrix are
the coordinates of the eigenvectors
of the linear operator.
So if you can find the eigenvalues and
eigenvectors of the matrix,
you have essentially found the eigenvalues
and the eigenvectors of the operator.
And the reason is, if we want to find the
eigenvalues and eigenvectors
of the operator, we want to find nonzero
vectors that satisfy this equation.
But something satisfies this equation,
L of v equals lambda v,
if and only if the coordinates of L of v
are the same as the coordinates
of lambda v. And how do you get the
coordinates of L of v?
Well you take the coordinates of v
and you multiply by the matrix of L.
And how do you get the coordinates
of lambda v? Well that's just
lambda times the coordinates of v.
So v is an eigenvector with
eigenvalue lambda, if and only if the
coordinates of v give you an eigenvector
of this matrix with the same eigenvalue.
So we're going to develop a whole
bunch of techniques for figuring out
eigenvalues and eigenvectors of matrices,
and then whenever you're given a linear
operator, you just find the matrix
of that operator and apply the same
techniques and you'll find the eigenvalues
and eigenvectors of the operator.
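As a small illustration of this claim (my own sketch, not from the lecture), take the derivative operator restricted to the two-dimensional space spanned by cos t and sin t. Its matrix in that basis is [[0, 1], [-1, 0]], and the eigenvector of that matrix with eigenvalue i has coordinates proportional to (1, i), which are exactly the coordinates of the eigenfunction cos t + i sin t from earlier.

```python
import numpy as np

# Matrix of the derivative operator in the basis {cos t, sin t}:
# L(cos t) = -sin t  -> coordinates (0, -1)  (first column)
# L(sin t) =  cos t  -> coordinates (1,  0)  (second column)
M = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

eigenvalues, eigenvectors = np.linalg.eig(M)
print(eigenvalues)    # i and -i, in some order

# The eigenvector for eigenvalue i is proportional to (1, i),
# i.e. the coordinates of the function cos t + i sin t
idx = np.argmax(eigenvalues.imag)
v = eigenvectors[:, idx]
print(v[1] / v[0])    # approximately 1j
```

So the eigenvalues of the matrix, i and -i, are the eigenvalues of the operator on this space, and the matrix eigenvectors hand you the coordinates of the eigenfunctions.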
