Earlier in the course we introduced you
to the Singular Value Decomposition. Now
for the algebraic eigenvalue problem, the
Schur Decomposition is sort of the
equivalent of the Singular Value Decomposition. It is just as big of a
result in linear algebra.  Now what is
that result?  Well, it says that given any
matrix A, any n by n matrix A, you can
always find the unitary matrix, Q and
an upper triangular matrix U such that A
can be written as the product of Q times
that upper triangular matrix times the
Hermitian transpose of Q, which of course
is the inverse of Q.  Okay?  This turns out
to be huge.  We love unitary matrices.
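Written out as an equation, the claim is:

\[
A = Q \, U \, Q^H, \qquad Q^H Q = I \ \ (\text{so } Q^H = Q^{-1}), \qquad U \ \text{upper triangular}.
\]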
Alright? So what it really means is that any matrix A is similar to an upper triangular matrix. And we saw earlier that finding the eigenvalues of an upper triangular matrix is easy: you just read them off the diagonal.
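For example (a small illustration of my own, not from the lecture):

\[
U = \begin{pmatrix} 2 & 1 & 5 \\ 0 & -3 & 4 \\ 0 & 0 & 7 \end{pmatrix}
\quad\Rightarrow\quad
\det(U - \lambda I) = (2 - \lambda)(-3 - \lambda)(7 - \lambda),
\]

so the eigenvalues are 2, -3, and 7: exactly the entries on the diagonal.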
So this is good news. How do we prove this? Well, given any matrix A (it's at least a one by one matrix; I guess we have encountered empty matrices in this course, but let's not go there), we know that it has at least one eigenvalue. And therefore we know that we can find a nonzero vector x such that A times x is equal to lambda times x. And we can actually do slightly better: let's take that vector to be of length 1 and call it q_1, because we can always take our vector x and normalize it to be of length 1, and we just get another eigenvector associated with lambda.
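In symbols, starting from the eigenvector x:

\[
q_1 = \frac{x}{\|x\|_2}, \qquad A \, q_1 = \lambda \, q_1, \qquad \|q_1\|_2 = 1.
\]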
Now what we can then do is exactly the kind of trick that we did when we talked about the Singular Value Decomposition: we can take that vector q_1 and make it the first column of a unitary matrix, where we call the rest of that matrix Q_2. Okay?
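That is, we complete q_1 to a unitary matrix

\[
Q = \begin{pmatrix} q_1 & Q_2 \end{pmatrix}, \qquad Q^H Q = I,
\]

so that the columns of Q_2 are orthonormal and, in particular, orthogonal to q_1.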
Now, if we hit matrix A on the left with the Hermitian transpose of that matrix and on the right with the matrix itself, well, that's sort of like what we want to do here, except with Q Hermitian transpose brought over to one side of the equation and Q to the other. Alright? This is in the right direction: if we can show that the result of this is upper triangular, we'd be done.
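In other words, since multiplying A = Q U Q^H by Q^H on the left and by Q on the right peels off the unitary factors, the claim is equivalent to showing

\[
Q^H A \, Q = U, \qquad U \ \text{upper triangular}.
\]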
But, you know, we're not that lucky yet; we're not that good at just magically picking Q_2. So let's see what we do get.
The product Q Hermitian transpose times A times Q is the block column stacking q_1 Hermitian transpose on top of Q_2 Hermitian transpose, times matrix A, times the partitioned matrix with columns q_1 and Q_2. Let's actually go ahead and multiply this out: A times that partitioned matrix would be A times q_1 and then A times Q_2. Okay? And we know how to do partitioned matrix-matrix multiplication; this is just an outer product with partitioned matrices, and therefore the result is the two-by-two blocked matrix with blocks q_1 Hermitian transpose A q_1, q_1 Hermitian transpose A Q_2, Q_2 Hermitian transpose A q_1, and Q_2 Hermitian transpose A Q_2. Now, A times q_1 is just lambda times q_1, and q_1 Hermitian transpose times q_1, the dot product of q_1 with itself, is just the square of its length. But its length we chose to be 1, so the top-left block becomes lambda. Okay? Similarly, in the bottom-left block, A times q_1 becomes lambda times q_1, and a scalar we can always bring to the front. And notice that we chose all the columns in Q_2 to be orthogonal to q_1, and therefore Q_2 Hermitian transpose times q_1 is 0, and we end up with a 0 vector in that bottom-left block. The top-right block is just some row vector; we can call it whatever we want to, say w transpose. And the bottom-right block is just some matrix.
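Putting all of that into one display (I'll write B for that bottom-right submatrix, a name I'm adding just so we can refer to it):

\[
Q^H A \, Q
= \begin{pmatrix} q_1^H \\ Q_2^H \end{pmatrix} A \begin{pmatrix} q_1 & Q_2 \end{pmatrix}
= \begin{pmatrix} q_1^H A \, q_1 & q_1^H A \, Q_2 \\ Q_2^H A \, q_1 & Q_2^H A \, Q_2 \end{pmatrix}
= \begin{pmatrix} \lambda & w^T \\ 0 & B \end{pmatrix}.
\]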
Then you can see that maybe we can move forward by repeating this process on that submatrix and eventually make it upper triangular, and through proof by induction you would then get there.
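Spelling that inductive step out (the hatted names are my notation): by induction the smaller matrix B has its own Schur decomposition, B = \hat{Q} \hat{U} \hat{Q}^H, and substituting it in gives

\[
A
= Q \begin{pmatrix} \lambda & w^T \\ 0 & B \end{pmatrix} Q^H
= \left( Q \begin{pmatrix} 1 & 0 \\ 0 & \hat{Q} \end{pmatrix} \right)
\begin{pmatrix} \lambda & w^T \hat{Q} \\ 0 & \hat{U} \end{pmatrix}
\left( Q \begin{pmatrix} 1 & 0 \\ 0 & \hat{Q} \end{pmatrix} \right)^H,
\]

where the middle factor is upper triangular and the outer factors, being products of unitary matrices, are again unitary.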
And just like when we talked about the Singular Value Decomposition, where we accumulated unitary matrices, you can accumulate the unitary matrices from each step into the final Q that we really wanted. Okay. So this Q from the first step is not the actual Q that we wanted; I probably should have put little checks on these qs, just to distinguish them from the final Q we're after. And, bingo, here we have the Schur Decomposition Theorem.
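If you want to see the theorem in action numerically, here is a minimal sketch (my addition, not part of the lecture) using SciPy's built-in Schur routine; it assumes numpy and scipy are available:

    import numpy as np
    from scipy.linalg import schur

    # Any square matrix will do; the complex Schur form always exists.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))

    # output='complex' asks for a genuinely upper triangular U; the real
    # Schur form may instead have 2x2 blocks on its diagonal for
    # complex-conjugate eigenvalue pairs.
    U, Q = schur(A, output='complex')

    # Q is unitary and A = Q U Q^H, just as the theorem promises.
    assert np.allclose(Q.conj().T @ Q, np.eye(4))
    assert np.allclose(Q @ U @ Q.conj().T, A)

    # The eigenvalues of A sit on the diagonal of U.
    print(np.sort_complex(np.diag(U)))
    print(np.sort_complex(np.linalg.eigvals(A)))

The two printed lists should agree, up to floating-point roundoff and ordering.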
