The second half of this week we're going
to spend on discussing practical
algorithms for finding eigenvectors and
eigenvalues.
Okay?  So let's reiterate: we have a matrix
A.  It is our desire to find eigenvalues
and corresponding eigenvectors of that
matrix.  We're going to assume that we
indexed these eigenvalues in such a way
that lambda_0 is the one that's largest
in magnitude, so that |lambda_0| > |lambda_1|.
How the rest of them stack up is actually not that important.  Okay?  We're
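As a concrete illustration of this indexing (a minimal sketch using a hypothetical 3 by 3 matrix, not one from the lecture): numpy's `eig` returns eigenvalues in no particular order, so we sort them by magnitude ourselves to get lambda_0 first.

```python
import numpy as np

# Hypothetical example matrix (not from the lecture), chosen so the
# eigenvalue magnitudes are distinct.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)

# Index the eigenvalues so that lambda_0 is largest in magnitude:
# |lambda_0| > |lambda_1| >= ... >= |lambda_{m-1}|.
order = np.argsort(-np.abs(eigvals))
lam = eigvals[order]          # eigenvalues, largest magnitude first
X = eigvecs[:, order]         # matching eigenvectors as columns

assert abs(lam[0]) > abs(lam[1])
```

The columns of `X` are the m linearly independent eigenvectors, so this example also exhibits the diagonalizability assumption: `A @ X` equals `X` times the diagonal matrix of eigenvalues.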
going to assume that our matrix is
diagonalizable.  So we have m linearly
independent eigenvectors if A is an m by
m matrix.  And we can write it either like
that or like this.  Now, an algorithm that
doesn't compute the eigenvector
associated with the largest eigenvalue
exactly, but that converges to, gets
arbitrarily close to, a vector in the
correct direction, can be given very simply.
Okay, all we do is we take an initial
vector v and we hit that vector with the
matrix A repeatedly, every time creating
a new vector.  Of course, in practice, we
can overwrite the original vector, but
that's a minor point.  Okay?  Under these
circumstances, this vector will
eventually point essentially in the
direction of an eigenvector associated with
the eigenvalue that is largest in
magnitude.  This method is known as the
power method.  Remarkably simple!
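The steps above can be sketched in a few lines of numpy (a minimal sketch, with a hypothetical 2 by 2 test matrix that is not from the lecture). One practical detail beyond what was said: rescaling the vector each iteration keeps its entries from overflowing or underflowing, and doesn't change its direction.

```python
import numpy as np

def power_method(A, num_iters=200, seed=0):
    """Repeatedly hit an initial vector with A, rescaling each step.

    Returns the Rayleigh-quotient estimate of the eigenvalue of A that
    is largest in magnitude, plus an approximate unit eigenvector.
    """
    rng = np.random.default_rng(seed)
    # A random start almost surely has a nonzero component along the
    # dominant eigenvector, which the method needs to converge.
    v = rng.standard_normal(A.shape[0])
    for _ in range(num_iters):
        v = A @ v                   # hit the vector with the matrix A
        v = v / np.linalg.norm(v)   # rescale; the direction is unchanged
    # Since ||v|| = 1, the Rayleigh quotient v^T A v estimates lambda_0.
    lam = v @ A @ v
    return lam, v

# Hypothetical test matrix: symmetric, so its eigenvalues are real and
# easy to check by hand ((5 +/- sqrt(5)) / 2).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, v = power_method(A)
```

The convergence rate depends on the ratio |lambda_1| / |lambda_0|: the closer the second-largest magnitude is to the largest, the more iterations are needed before the vector points essentially along the dominant eigenvector.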
