This is the last in our series of videos
on 'Tricks of the Trade', ways to
diagonalize matrices. In this video,
we're going to look at how to use
transposes, row sums, and column
sums to find eigenvalues.
The idea about row sums is
the following. Take a look at this
matrix. You'll notice that each row
adds up to 2: 1 + 3 + -2 makes 2,
4 + -3 + 1 is 2, -1 + 1 + 2 is 2. So what
happens if you multiply this matrix
by (1 1 1)? You get 1 + 3 + -2,
4 - 3 + 1, -1 + 1 + 2, and that is
(2 2 2), so (1 1 1) was an eigenvector
with eigenvalue 2. If the rows
had all added up to 17, then (1 1 1)
would have been an eigenvector
with eigenvalue 17. Whenever all the
rows of a matrix have the same sum,
then (1 1 1 1 1), however many ones you
need, you know, n ones for an n by n
matrix, that's always gonna be an
eigenvector, and whatever that sum is
is going to be an eigenvalue. So, for
example, suppose we have the matrix
((0 1 1) (1 0 1) (1 1 0)), we've seen this
matrix before, but the sum of the first
row is two, the sum of the second row
is two, the sum of the third row is two,
so all the rows add up to two, and that
means that two has to be an eigenvalue.
And now we can use our other tricks to
find the other eigenvalues.
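As a quick numerical check of the row-sum trick, here's a minimal Python sketch (mine, not from the video) that multiplies this matrix by the all-ones vector:

```python
# Row-sum trick: if every row of M sums to s, then the all-ones
# vector is an eigenvector of M with eigenvalue s.

def mat_vec(M, v):
    """Multiply a matrix (stored as a list of rows) by a vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

M = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]  # every row sums to 2

ones = [1, 1, 1]
print(mat_vec(M, ones))  # [2, 2, 2], i.e. 2 * (1 1 1)
```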
The trace is 0 + 0 + 0, so all the 
eigenvalues have to add up to 0,
2 plus the other two eigenvalues
is 0, so the other two
eigenvalues have to add up to -2.
The determinant is 2, so 2 times lambda
2 times lambda 3, is 2, so lambda 2 times
lambda 3 has to be 1, and the only
numbers that multiply to 1 and add to
-2 are -1 and -1, so we have 2 is an
eigenvalue, with multiplicity 1,
-1 is an eigenvalue with algebraic
multiplicity 2. It turns out to also have
geometric multiplicity 2, but you have
to do more work to figure that out.
Okay, now, I talked about
transposes. A property of determinants
is that if you take the transpose of
a matrix and then take its determinant,
that's the same thing as the determinant
of the original matrix. So if you take
the determinant of lambda times the
identity minus A transpose, well, that's
the same thing as the determinant of
the transpose of lambda I minus A,
because the identity transpose is just
the identity, and that's gonna be the
determinant of lambda I minus A, because
when you take the transpose of a matrix,
you don't change the determinant.
But, this is the characteristic polynomial
of A transpose, and this is the
characteristic polynomial of A, so
A and A transpose have the same
characteristic polynomial, well that
means they have to have the same
eigenvalues, so the eigenvalues of
A transpose are always the same
as the eigenvalues of A. Now, the
eigenvectors can be very different,
but the eigenvalues are the same.
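This identity can be sanity-checked numerically. Here's a Python sketch using the row-sums-equal-2 matrix from earlier; since two monic degree-3 polynomials that agree at four points must be identical, comparing det(lambda I - A) with det(lambda I - A transpose) at four sample values of lambda verifies the claim for this matrix:

```python
# A and A-transpose have the same characteristic polynomial:
# det(lam*I - A^T) = det((lam*I - A)^T) = det(lam*I - A).

def det3(M):
    """3x3 determinant by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def transpose(M):
    return [list(col) for col in zip(*M)]

def char_poly_at(M, lam):
    """Evaluate det(lam*I - M) at a single value of lam."""
    n = len(M)
    return det3([[(lam if r == c else 0) - M[r][c] for c in range(n)]
                 for r in range(n)])

A = [[1, 3, -2],
     [4, -3, 1],
     [-1, 1, 2]]  # each row sums to 2

for lam in (0, 1, 3, 5):
    assert char_poly_at(A, lam) == char_poly_at(transpose(A), lam)
print(char_poly_at(A, 2))  # 0, confirming 2 is an eigenvalue of both
```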
So now, let's take a look at this
matrix. You look at the rows, and you
say oh, this row adds up to 4, and this
row adds up to 1, and this row adds up
to 1, that doesn't look like there's
anything happening with the rows.
But the columns all add up to the same
thing. This column adds up to 2, this
column adds up to 2, this column adds
up to 2. Since the columns all add up
to 2, well the columns of M are the
rows of M transpose, so all the rows
of M transpose add up to 2, but that
means that 2 has to be an eigenvalue
of M transpose, well that means that 2
has to be an eigenvalue of M. The
general principle is that if you ever
have a situation where all the columns
of the matrix have the same sum,
then that sum is an eigenvalue. You can't
immediately write down the eigenvector,
there's no simple formula for it,
but the common sum of the columns
is an eigenvalue. Now, it often happens in
probability that you're dealing with
matrices that describe the probability of
a situation tomorrow as a function of
the probability of how things are today,
and each column adds up to 1.
That's called a probability matrix:
a matrix where each entry is
non-negative and the columns add
up to 1. By this principle,
such matrices always have 1 as an
eigenvalue. Okay, so let's combine all
the tricks we've learned so far to figure
out the eigenvalues of this big 5 by 5
matrix. The first thing we want to do,
is we want to partition it. It's block
triangular, so we really only have to find
the eigenvalues of A, and we have to
find the eigenvalues of D, so let's do
these one at a time, let's look at A.
Oops, this is a MINUS 5. The trace
of A is 0. The determinant of A
is 5 times -5, minus 3 times -3, so that's
-25 plus 9, so it's -16, so the
eigenvalues of A have the property
that they add up to 0, and their product
is -16, so that means the eigenvalues of A
have to be 4 and -4, and that means that
the eigenvalues of M have to include
4 and -4. Now, let's take a look at D.
This column adds up to 3, this column
adds up to 3, this column adds up to 3,
so one of the eigenvalues is 3.
Sorry, 4 and -4 were the first and second
eigenvalues, so this is gonna be the
third eigenvalue.
Now, the trace is 6, and that means
that 3 plus the 4th eigenvalue plus
the 5th eigenvalue has to be 6,
so the fourth eigenvalue plus the
fifth eigenvalue has to be 3. Now,
if you work out the determinant,
you get the determinant of D is
also 6, and that has to be 3 times
lambda 4 times lambda 5, so
lambda 4 times lambda 5 has to be
2, so what are two numbers that
add up to 3 and whose product is
2? Well, 1 and 2. So there we have it.
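Both leftover pairs of eigenvalues were pinned down the same way: each pair's sum (from the trace) and product (from the determinant) are known, so each pair solves a quadratic. A short Python sketch redoing that arithmetic:

```python
import math

# Two unknown eigenvalues with known sum s and product p
# are the roots of x^2 - s*x + p = 0.

def pair_from_sum_product(s, p):
    """Solve x^2 - s*x + p = 0 via the quadratic formula."""
    disc = math.sqrt(s * s - 4 * p)
    return sorted([(s - disc) / 2, (s + disc) / 2])

print(pair_from_sum_product(0, -16))  # block A: [-4.0, 4.0]
print(pair_from_sum_product(3, 2))    # block D: [1.0, 2.0]
```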
The eigenvalues of this big 5 by 5
matrix are 4, -4, 3, 1, and 2. Now how
do you find the eigenvectors, well,
if you find the eigenvectors of this
piece, you pad them with 0's and you
get an eigenvector of the whole thing.
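The zero-padding step looks like this in Python. The 4-by-4 block upper-triangular matrix here is made up for illustration, not the one from the video; only the top-left block matters, since the padded zeros kill the other columns:

```python
def mat_vec(M, v):
    """Multiply a matrix (stored as a list of rows) by a vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# Hypothetical block upper-triangular matrix: top-left 2x2 block
# [[5, 3], [-3, -5]], arbitrary upper-right block, some lower-right block.
M = [[5, 3, 7, 1],
     [-3, -5, 2, 4],
     [0, 0, 2, 1],
     [0, 0, 0, 3]]

# (3, -1) is an eigenvector of the top-left block with eigenvalue 4;
# padding it with zeros gives an eigenvector of the whole matrix.
v = [3, -1, 0, 0]
print(mat_vec(M, v))  # [12, -4, 0, 0], i.e. 4 * v
```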
If you have the eigenvectors of
this piece, you can't pad them with 0's to
get eigenvectors of the whole thing,
you have to take M minus 3 times the
identity and row reduce it to figure
out this eigenvector, M minus 1 times the
identity and row reduce it to get this
eigenvector, and M minus 2 times the
identity and row reduce it to get this
eigenvector. The tricks of the trade
generally help a lot with finding
eigenvalues, but once you've got the
eigenvalues, you still have to work
to get the eigenvectors.
