>> Let's talk about eigenvalues and eigenvectors of real matrices acting on C^n. And so in this class we're primarily interested in real 2 by 2 matrices acting on C^2.
And we're not really
looking to extend the theory
of linear algebra to complex
numbers, what we're trying
to do is see what the
complex numbers can tell us
about real vector spaces.
So let A be a real 2 by 2 matrix with complex eigenvalue lambda equals a minus ib. Here b is not zero. And corresponding complex eigenvector v, which we write as u plus iw. And now here a and b are real numbers, and u and w are vectors in R^2.
In that case we know that
a plus ib is also going
to be a complex eigenvalue
with corresponding
eigenvector u minus iw.
And so what this is saying is
if I have a complex eigenvalue
lambda with eigenvector v,
then the complex conjugate of
lambda is also an eigenvalue
with corresponding eigenvector
complex conjugate of v.
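This conjugate-pair property is easy to check numerically. Here is a minimal sketch in plain Python, using a hypothetical example matrix of my own (the 90 degree rotation, not a matrix from this lecture) and a small `matvec` helper that is also my own name:

```python
# Check that conjugating an eigenpair of a real matrix gives another eigenpair.
# A is the 90-degree rotation matrix, an illustrative example (not from the lecture).
A = [[0.0, -1.0], [1.0, 0.0]]

def matvec(M, v):
    """Multiply a 2x2 real matrix by a length-2 (possibly complex) vector."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

lam, v = 1j, [1, -1j]                       # eigenpair: A v = i v
assert matvec(A, v) == [lam*x for x in v]

# The conjugate pair: A vbar = conj(lambda) vbar
vbar = [x.conjugate() for x in v]
assert matvec(A, vbar) == [lam.conjugate()*x for x in vbar]
```

The key point is that conjugating the equation A v = lambda v leaves the real matrix A unchanged, which is exactly what the assertions confirm.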
Now here's a theorem that tells us if we have a complex eigenvalue, then A is going to be similar to a very special matrix.
All right, so suppose A is a 2 by 2 real matrix with eigenvalue a minus ib, with b not equal to 0, and corresponding complex eigenvector u plus iw. Then A is similar to this matrix C. So A is P times C times P inverse, where P is going to have for its columns the vectors u and w. That's the real and imaginary parts of the complex eigenvector. And C is going to have this very nice shape: it will be a, minus b, b, a. So I say this has a very special shape.
What's so good about it?
What's so nice about this matrix C? Well, if I'm looking at a, minus b, b, a,
and lambda's the complex
number a minus ib,
the complex conjugate
of lambda is a plus ib.
And so if I multiply those
two, I get the square
of the complex norm, which
is a squared plus b squared.
And so that's the same value,
that's the same real
number I get
if I just compute determinant
of this 2 by 2 matrix.
And so we'll let r
be the square root
of a squared plus b squared.
So r is just equal to the
complex norm of lambda.
And I'm going to start decomposing this matrix C, this a, minus b, b, a. So the first thing is I can pull off the scalar matrix r times the identity, which is the matrix r, 0, 0, r. And I'm left with the matrix a over r, minus b over r, b over r, a over r.
And this has a very
nice symmetric pattern,
and it has determinant 1.
This tells me that this has
the shape of a rotation matrix.
And so there's an angle phi
such that this a over r,
minus b over r, b over r,
a over r can be expressed
as cosine phi, minus
sin phi, sin phi,
cosine phi, for some angle phi.
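The decomposition just described can be written out in one line, with r and phi as in the lecture:

```latex
C = \begin{pmatrix} a & -b \\ b & a \end{pmatrix}
  = \underbrace{\begin{pmatrix} r & 0 \\ 0 & r \end{pmatrix}}_{\text{scaling}}
    \underbrace{\begin{pmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \cos\varphi \end{pmatrix}}_{\text{rotation}},
\qquad
r = \sqrt{a^2 + b^2},\quad \cos\varphi = \frac{a}{r},\quad \sin\varphi = \frac{b}{r}.
```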
So that tells us that this matrix C, which we were writing as a, minus b, b, a, can be thought of as a scaling combined with a rotation.
So for example, let's
go through this example
in all of its details.
So let's first find the complex
eigenvalues and eigenvectors
of this 2 by 2 matrix A, given by 1, minus 2, 1, 3.
So first I just compute the
characteristic polynomial.
I get lambda squared
minus 4 lambda plus 5.
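Written out, that characteristic polynomial computation is:

```latex
\det(A - \lambda I)
  = \det\begin{pmatrix} 1-\lambda & -2 \\ 1 & 3-\lambda \end{pmatrix}
  = (1-\lambda)(3-\lambda) - (-2)(1)
  = \lambda^2 - 4\lambda + 5.
```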
And so then I can use
the quadratic formula
to find the eigenvalues.
And I see the eigenvalues
are 2 plus or minus i.
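The quadratic formula step, written out:

```latex
\lambda = \frac{4 \pm \sqrt{16 - 20}}{2} = \frac{4 \pm \sqrt{-4}}{2} = 2 \pm i.
```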
So let's first find
the eigenspace
for 2 minus i. All right,
and so when I set it
up I would typically
row reduce this matrix.
All right, so I set up the
augmented matrix and row reduce.
But in this case we can use a little trick. Namely, we know that the rows are going to be scalar multiples of each other, because the fact that 2 minus i is an eigenvalue tells me that there's an eigenvector, and I know there's another eigenvector corresponding to the eigenvalue 2 plus i. So the null space of this matrix A minus (2 minus i) times the identity is going to be one dimensional, which means either of these rows can be used to determine the eigenvector.
All right, so let me pick the bottom row to use to determine the eigenvector, and I'll have that x1 plus (1 plus i) times x2 is 0, which means if I pick x1 to be 1 plus i, and x2 to be minus 1, that gives me a non-trivial solution to this equation. So it tells me I can pick as an eigenvector 1 plus i, minus 1.
Then the complex conjugate eigenvalue 2 plus i has corresponding eigenvector 1 minus i, minus 1.
All right so I just took
the complex conjugate
of the other eigenvector.
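Both eigenpairs can be verified directly in plain Python, which handles complex arithmetic natively; the `matvec` helper is my own name for the 2 by 2 matrix-vector product:

```python
# Verify both eigenpairs for A = [[1, -2], [1, 3]].
A = [[1, -2], [1, 3]]

def matvec(M, v):
    """Multiply a 2x2 real matrix by a length-2 (possibly complex) vector."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

lam = 2 - 1j
v = [1 + 1j, -1]
assert matvec(A, v) == [lam*x for x in v]          # A v = (2 - i) v

# The conjugate eigenpair: lambda = 2 + i, eigenvector (1 - i, -1)
lam_c = lam.conjugate()
v_c = [x.conjugate() for x in v]
assert matvec(A, v_c) == [lam_c*x for x in v_c]
```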
And so now suppose I'm asked to find the matrices P and C, so that A is P times C times P inverse, where C has this special shape a, minus b, b, a. Well, for the matrix C I'm just going to look at the coefficients in this eigenvalue 2 minus i, and so I get that C is 2, minus 1, 1, 2.
And for the matrix P I'm going to take the real and imaginary parts of the corresponding complex eigenvector. So 1, minus 1 is the first column and 1, 0 is the second.
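The factorization A = P C P inverse can be checked by hand or in a few lines of plain Python; the `matmul` helper is my own name for the 2 by 2 matrix product, and the inverse uses the standard 2 by 2 adjugate formula:

```python
# Check A = P C P^{-1} for the example: P has columns u = (1, -1) and w = (1, 0).
def matmul(X, Y):
    """Product of two 2x2 matrices."""
    return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
            [X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

A = [[1, -2], [1, 3]]
P = [[1, 1], [-1, 0]]
C = [[2, -1], [1, 2]]

# Inverse of a 2x2 matrix via the adjugate formula; det P = 1 here.
detP = P[0][0]*P[1][1] - P[0][1]*P[1][0]
Pinv = [[ P[1][1]/detP, -P[0][1]/detP],
        [-P[1][0]/detP,  P[0][0]/detP]]

assert matmul(matmul(P, C), Pinv) == A
```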
Now let's continue this further and start to analyze this matrix C. Let's find the angle of rotation phi and the scaling factor r. All right, this C can be thought of as a rotation combined with a scaling.
And so to compute r I can
either compute the determinant
of this matrix C and
take its square root
or just compute the complex norm
of lambda, which was 2 minus i.
In either case I get
that r squared is 5.
So r is the square root of 5.
Which means that C can be written as the scalar matrix square root of 5, times the matrix 2 over root 5, minus 1 over root 5, 1 over root 5, 2 over root 5.
That tells me that cosine
phi is 2 over root 5
and sin phi is 1 over root 5.
This tells me the point (cosine phi, sin phi) is in the first quadrant, and if I take the inverse cosine of 2 over root 5 I see that phi is approximately 0.46 radians. And so the angle of rotation is about 0.46 radians and the scaling factor is the square root of 5.
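These last two computations can be sketched with Python's standard `math` module; `math.atan2(b, a)` returns exactly the angle whose cosine is a over r and whose sine is b over r, so it lands in the right quadrant automatically:

```python
import math

# Recover the scaling factor r and rotation angle phi from C = [[2, -1], [1, 2]].
a, b = 2.0, 1.0                      # from lambda = a - ib = 2 - i
r = math.hypot(a, b)                 # sqrt(a^2 + b^2) = sqrt(5)
phi = math.atan2(b, a)               # cos(phi) = a/r, sin(phi) = b/r

assert abs(r - math.sqrt(5)) < 1e-12
assert abs(phi - 0.4636) < 1e-3      # about 0.46 radians, first quadrant
```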
