Hello. As you may recall, last time we found a linear transformation that reflects a vector in R^2 about the line y = -x.
Well, today I want to talk about eigenvalues and eigenvectors, and I want to give you an intuitive approach here, so this may be a shorter video. We will look at the theory later.
I thought the example we did last time was perfect for an intuitive approach. Last time we looked at two particular vectors, because we knew what the transformation would do to them. We looked at the vector that was on the line, (1, -1), because we knew it would just stay there when reflected about that line. And then we looked at the vector (1, 1), which was normal to that line; we knew it would be reflected to exactly the opposite side.
So this is a good tie-in for eigenvalues and eigenvectors. So really, what are eigenvalues and eigenvectors?
Essentially, eigenvalues and eigenvectors look like this. You have a matrix A, and here A is the matrix we found for that reflection about the line y = -x. If you take A times a certain vector v and get a scalar multiple of that vector back, A v = lambda v, then we call v an eigenvector of A, and lambda a corresponding eigenvalue of A. So essentially, in terms of the transformation, what it means is this: if I plug a vector into that transformation, multiplying by that matrix A, I should get a multiple of that vector back.
Or, if I'm just looking at the eigenvalues and eigenvectors of a matrix, what it says is that if I multiply this matrix times a vector, I should just get a scalar multiple of that vector back. So it really scales these vectors. And the number of eigenvalues, counted with multiplicity, is equal to the dimension of the matrix, so in this 2-by-2 case above we should have two eigenvalues.
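Since the matrix here is concrete, we can sanity-check that claim numerically. Here is a quick sketch in Python with NumPy (the variable names are mine, not from the lecture):

```python
import numpy as np

# The reflection matrix about the line y = -x from last time
A = np.array([[0, -1],
              [-1, 0]])

# np.linalg.eig returns one eigenvalue per dimension of the matrix
eigenvalues, eigenvectors = np.linalg.eig(A)
print(sorted(eigenvalues))  # two eigenvalues for a 2x2 matrix: [-1.0, 1.0]
```

The two values that come back are exactly the scalars we reason about below.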
So I claim that if we look up top here, those eigenvalues should correspond exactly to the eigenvectors we were using up here, the one in purple and the one in red. So let's take a look at why that might be the case.
If I look at the vector in the direction of that line, (1, -1), what does the matrix do to it? I take A times the vector: that's (0, -1; -1, 0) times (1, -1). Multiply across and down, that's 1; multiply across and down, that's -1. So it's equal to the exact same vector. So we get A v = 1 times v, where 1 is my eigenvalue and v is my eigenvector, and we knew that because the vector just stays there.
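That multiplication is easy to reproduce; here is a small sketch in Python with NumPy (variable names are my own):

```python
import numpy as np

A = np.array([[0, -1],
              [-1, 0]])   # reflection about y = -x
v = np.array([1, -1])     # the vector lying on that line

# The reflection leaves v where it is: A v = 1 * v
print(A @ v)  # [ 1 -1]
```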
And if I look at that normal vector, which was (1, 1), we should see that the transformation just flips it to the opposite side. So T(n) should equal (-1, -1), and you can almost guess what the eigenvalue will be that corresponds to this. Can you guess? Well, you're right: it's going to be -1.
So if I take A times the normal vector, I get (0, -1; -1, 0) times (1, 1). Multiply across and down, that's -1; multiply across and down, that's -1 again, which equals -1 times the original vector I plugged in. So you can see that in this case A n is equal to -1 times n. So these two values, 1 and -1, scale those eigenvectors, and for this matrix the eigenvalues are 1 and -1.
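And the same check works for the normal vector; again, a sketch in Python with NumPy:

```python
import numpy as np

A = np.array([[0, -1],
              [-1, 0]])  # reflection about y = -x
n = np.array([1, 1])     # the normal vector to that line

# The reflection flips n to the opposite side: A n = -1 * n
print(A @ n)  # [-1 -1]
```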
