This is the third in our series of
videos about techniques for
diagonalizing matrices.
In this one video, we're gonna talk
about block diagonal and
block triangular matrices.
A block triangular matrix is something
of this form: we can divide it into blocks,
and this is what we call upper block
triangular. This is what we call
lower block triangular.
There's a square block and either only
the upper part counts or only
the lower part counts. 
The rest is 0.
This is what we call block diagonal:
there's an upper left block and a lower
right block and everything else is 0.
Let's make that a little bit more precise.
In general, we're gonna look at 
partition matrices.
We want a partition matrix where
A and D are square and
B and C can be rectangular.
A matrix is block upper triangular
if C is 0, so that A, B, and D are the
only blocks that contribute.
It's block lower triangular if B is 0,
if A, C, and D are the only things
that contribute.
It's block diagonal if B is 0
and C is 0.
Now, what's so great about all these
cases is that in all of these cases,
the problem of finding eigenvalues of
this big matrix is reduced to studying
the matrix A and the matrix D.
In all of the cases, the eigenvalues of 
the big matrix are just the eigenvalues
of A together with the eigenvalues of D.
That's a lot simpler, small matrices are
easier to handle than big matrices.
The characteristic polynomial of
the big matrix is the product of the
characteristic polynomials of A and D.
If lambda is an eigenvalue of A,
then P_A of lambda is 0.
That makes P_M of lambda 0.
If lambda is an eigenvalue of D,
then P_D of lambda is 0 and so
P_M of lambda is 0.
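This factorization is easy to sanity-check numerically. Here's a NumPy sketch (the blocks A, B, and D are hypothetical choices for illustration, not the ones from the video) confirming that the eigenvalues of a block upper triangular matrix are the eigenvalues of A together with those of D:

```python
import numpy as np

# Hypothetical blocks for illustration: A is 2x2, D is 3x3, B is 2x3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # eigenvalues 3 and 1
B = np.array([[1.0, 0.0, 4.0],
              [0.0, 2.0, 1.0]])   # arbitrary rectangular block
D = np.array([[4.0, 1.0, 0.0],
              [0.0, 5.0, 1.0],
              [0.0, 0.0, 6.0]])   # eigenvalues 4, 5, 6

# Assemble the block upper triangular matrix M = [A B; 0 D].
M = np.block([[A, B],
              [np.zeros((3, 2)), D]])

# The eigenvalues of M are those of A together with those of D.
eig_M = np.sort(np.linalg.eigvals(M))
eig_AD = np.sort(np.concatenate([np.linalg.eigvals(A),
                                 np.linalg.eigvals(D)]))
print(np.allclose(eig_M, eig_AD))  # True
```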
Let's walk through the cases one at a time
and see why this works.
The simplest case by far is when you have
things that are block diagonal.
So B and C are 0 and we just have an A in
the upper left and a D in the lower right.
Then if we try to compute the
characteristic polynomial,
lambda I minus M has lambda times the
identity minus A in the upper left and
lambda times the identity minus D
in the lower right.
These identities are different sizes.
If this is a three by three block and
this is a five by five block, 
then this is a three by three identity,
this is a five by five identity,
this is an eight by eight identity,
and I'm not going to write subscripts
to tell you the size of the identities.
When you take the determinant of this,
the determinant of the left hand side,
that's the characteristic polynomial of M,
and it's this determinant times this determinant.
This determinant is the characteristic
polynomial of A and this determinant
is the characteristic polynomial of D.
There we have our factorization,
the characteristic polynomial,
and that's what tells you the eigenvalues
of A and D are eigenvalues of M.
But it's more than eigenvalues,
it's also eigenvectors.
If you have an eigenvector of A,
then you can pad it with 0's to get
an eigenvector of M.
If you multiply A 0 0 D by (x, 0),
you get Ax upstairs and you get
D times 0 downstairs, so the product is
(lambda x, 0), which is
lambda times (x, 0).
Great, it's an eigenvector, 
and likewise,
if you have an eigenvector of D, 
we'll call the eigenvalue mu instead
of lambda just because we've already
used lambda.
Then you pad it with 0's upstairs, and
A 0 0 D times (0, y) gives you (0, Dy),
and that's (0, mu y), which is mu times (0, y).
So block diagonal is really simple,
the eigenvalues of M are the
eigenvalues of A and D.
The eigenvectors of M are easily built
from the eigenvectors of A and D.
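In NumPy, the zero-padding construction looks like this (a small sketch with made-up diagonal blocks):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 3 and 1
D = np.array([[4.0, 0.0],
              [0.0, 5.0]])   # eigenvalues 4 and 5

# Block diagonal M = [A 0; 0 D].
M = np.block([[A, np.zeros((2, 2))],
              [np.zeros((2, 2)), D]])

# (1, 1) is an eigenvector of A with eigenvalue 3;
# padding it below with 0's gives an eigenvector of M.
x = np.array([1.0, 1.0])
padded = np.concatenate([x, np.zeros(2)])
print(np.allclose(M @ padded, 3 * padded))  # True
```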
Now block triangular is a little - 
oh sorry, let's work an example.
In this example, we have a five by five
matrix but we break it down into a
two by two matrix and 
a three by three matrix.
The two by two matrix is of a form
we recognize: it's our favorite matrix
with eigenvalues 3 and 1, and its
eigenvectors are (1, 1) and (1, -1).
We've also seen the three by three 
matrix before.
Its eigenvalues are 2, -1, and -1, 
which is to say they're 2 and -1
but -1 is a double root.
The eigenvectors are (1, 1, 1), (-1, 1, 0),
and (-1, 0, 1).
That means that the eigenvalues of
the whole matrix are these eigenvalues
and these eigenvalues.
And you get the eigenvectors of the
whole matrix too: you get two of them
by taking the eigenvectors of A
and padding them with 0's on the bottom,
and you get the other three by taking
the eigenvectors of D and padding them
with 0's on the top.
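The transcript doesn't spell out the entries of the five by five matrix, but blocks with exactly the stated eigenvalues and eigenvectors exist, so a plausible reconstruction can be checked in NumPy (the blocks below are assumptions, not taken from the video):

```python
import numpy as np

# Assumed blocks (the on-screen matrices aren't in the transcript;
# these have exactly the stated eigenvalues and eigenvectors).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # eigenvalues 3, 1
D = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])  # eigenvalues 2, -1, -1

M = np.block([[A, np.zeros((2, 3))],
              [np.zeros((3, 2)), D]])

# Eigenvectors of A padded below with 0's, eigenvectors of D padded above.
v1 = np.array([1.0, 1.0, 0.0, 0.0, 0.0])  # from A, eigenvalue 3
v3 = np.array([0.0, 0.0, 1.0, 1.0, 1.0])  # from D, eigenvalue 2

print(np.allclose(M @ v1, 3 * v1))  # True
print(np.allclose(M @ v3, 2 * v3))  # True
```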
So block diagonal is pretty simple.
Block triangular is a little bit tougher.
If something is block triangular, then
you still have the factorization of the
characteristic polynomial. 
See, if you have a matrix like this,
it's still true that the determinant of 
this big matrix is the determinant of
this corner times the determinant
of this corner, and that's enough to
get you the characteristic 
polynomial of M,
is the characteristic polynomial of A
times that of D.
What's more is if you have eigenvectors
of A, you can still pad them with 0's
to get eigenvectors of the big matrix.
See, at the top, you get Ax plus B times 0,
which is Ax, that is, lambda x.
And at the bottom, you get 0x plus D times 0,
and that's 0, so the padded vector is still
an eigenvector with eigenvalue lambda.
However, you cannot pad things
the other way.
If you find an eigenvector of D,
then 0 followed by that eigenvector is
typically not an eigenvector of M.
Because if you multiply it out, at the top
you get A times 0 plus B times y,
which is By, and that's generally not 0.
At the bottom, you get 0 times 0 plus
D times y, which is mu y, so that part
works, but the whole vector (By, mu y)
is not a multiple of (0, y) because the top
term is By, not 0.
So the B gets in the way of 
extending the eigenvector.
The eigenvectors of A extend nicely,
the eigenvectors of D don't.
The eigenvalues are fine, 
but if you want to find the eigenvectors
you have to sweat. You have to actually
write M - lambda times the identity,
row reduce and do all that work 
to get the eigenvectors.
For example, this is a block upper triangular
matrix. It's got an upper left block,
which is 2 3 3 2, and it's got a lower
right block, which is just the number 4,
a one by one matrix.
So from the 2 3 3 2, we've seen that
whenever you have a matrix of the form
a b b a, you get eigenvalues
a + b and a - b. So that's 2 + 3 = 5
and 2 - 3 = -1, and the eigenvectors are
(1, 1) and (1, -1).
The one by one matrix, well the 
eigenvector is 1 and the eigenvalue
is the number in it.
It's already diagonal.
So if you take the eigenvectors of the
upper left block and pad them with 0's,
you get eigenvectors of the
three by three. You multiply this
by (1, 1, 0): you get 2 + 3 + 0, that's 5;
3 + 2 + 0, that's 5;
0 + 0 + 0, that's 0.
It works.
That's 5 times (1, 1, 0), and likewise if you
plug in (1, -1, 0), you get an eigenvector
with eigenvalue -1.
However, try taking the eigenvector of D
and padding it upwards to get (0, 0, 1).
If you multiply the matrix, with rows
2 3 5, 3 2 7, and 0 0 4, by (0, 0, 1),
you get (5, 7, 4),
which is not 4 times (0, 0, 1).
So if you want to find the
eigenvector with eigenvalue 4,
and there is one,
you have to go ahead and
write down the matrix minus 4
times the identity.
Then you have to go through the whole
long song and dance of row reduction,
and you get these really ugly fractions,
and you discover that the eigenvector
is (-31/5, -29/5, 1). Much harder.
It can be done, but it takes work.
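All of these checks can be reproduced in NumPy using the matrix from the example; the final linear solve stands in for the row reduction:

```python
import numpy as np

M = np.array([[2.0, 3.0, 5.0],
              [3.0, 2.0, 7.0],
              [0.0, 0.0, 4.0]])

# Padding the eigenvector of the upper left block works:
# (1, 1, 0) is an eigenvector with eigenvalue 5.
e1 = np.array([1.0, 1.0, 0.0])
print(np.allclose(M @ e1, 5 * e1))         # True

# Padding the eigenvector of the lower right block does not:
# M times (0, 0, 1) is (5, 7, 4), not a multiple of (0, 0, 1).
print(M @ np.array([0.0, 0.0, 1.0]))       # [5. 7. 4.]

# To find the eigenvector for eigenvalue 4, solve (M - 4I)v = 0.
# Fixing the last coordinate to 1 reduces it to a 2x2 linear solve.
N = M - 4 * np.eye(3)
xy = np.linalg.solve(N[:2, :2], -N[:2, 2])
v = np.array([xy[0], xy[1], 1.0])
print(v)                                   # approximately (-31/5, -29/5, 1)
print(np.allclose(M @ v, 4 * v))           # True
```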
If something is block lower triangular,
it works the same as block upper triangular,
except the roles of A and D are reversed.
You still have that the eigenvalues
of M are the eigenvalues of A
together with the eigenvalues of D.
And if you have an eigenvector of D,
now you can pad it upwards.
But if you have an eigenvector of A,
you cannot pad it downwards.
The eigenvalues of A are eigenvalues
of M, but you have to work hard
to figure out what the eigenvectors are.
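A quick NumPy sketch of the lower triangular case (with made-up blocks again): the eigenvector of D pads upward, but the eigenvector of A does not pad downward.

```python
import numpy as np

# Block lower triangular M = [A 0; C D] with hypothetical blocks.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 3 and 1
C = np.array([[1.0, 3.0]])
D = np.array([[4.0]])

M = np.block([[A, np.zeros((2, 1))],
              [C, D]])

# Eigenvector of D padded upward with 0's works: eigenvalue 4.
up = np.array([0.0, 0.0, 1.0])
print(np.allclose(M @ up, 4 * up))  # True

# Eigenvector of A padded downward fails: C gets in the way.
down = np.array([1.0, 1.0, 0.0])
print(M @ down)                     # [3. 3. 4.], not 3 times (1, 1, 0)
```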
