The second half of this week we're going
to spend on discussing practical
algorithms for finding eigenvectors and
eigenvalues.
Okay?  So let's reiterate.  We have a matrix A.  It is our desire to find eigenvalues
and corresponding eigenvectors of that
matrix.  We're going to assume that we've indexed these eigenvalues in such a way that lambda_0 is the one that's largest in magnitude: its magnitude is strictly greater than the magnitude of lambda_1.  And then how the rest of them stack up is actually not that important. Okay?  We're going to assume that our matrix is diagonalizable, so if A is an m-by-m matrix we have m linearly independent eigenvectors, and we can write the matrix either as A = X Lambda X^{-1} or, equivalently, as A X = X Lambda, where the eigenvectors are the columns of X.  Now, an algorithm that doesn't compute the eigenvector associated with the largest eigenvalue exactly, but that converges to -- gets arbitrarily close to -- a vector in the correct direction, can be given very, very simply.
All we do is we take an initial vector v, and we hit that vector with the matrix A repeatedly, every time creating a new vector.  Of course, in practice we can overwrite the original vector, but that's a minor point.  Okay?  Under these circumstances this vector will eventually point essentially in the direction of an eigenvector associated with the eigenvalue that is largest in magnitude.  This method is known as the power method.  Remarkably simple.
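The iteration just described can be sketched in a few lines of Python with NumPy. The reason it works: writing the start vector in the eigenbasis, A^k v = sum_j c_j lambda_j^k x_j, so the term belonging to the eigenvalue of largest magnitude dominates once (|lambda_1|/|lambda_0|)^k is small. The function name, the normalization per step, and the Rayleigh-quotient stopping test below are my own additions; the lecture only specifies the repeated multiplication itself.

```python
import numpy as np

def power_method(A, num_iters=500, tol=1e-12, seed=0):
    """Power method sketch: repeatedly hit a vector with A.

    Assumes A is diagonalizable with a strictly dominant eigenvalue,
    as in the lecture. Normalizing each step keeps the entries from
    overflowing or underflowing; it doesn't change the direction.
    """
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])  # random start: c_0 != 0 almost surely
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(num_iters):
        w = A @ v                   # hit the vector with A
        v = w / np.linalg.norm(w)   # renormalize; direction unchanged
        lam_new = v @ A @ v         # Rayleigh quotient estimate of lambda_0
        if abs(lam_new - lam) < tol * max(1.0, abs(lam_new)):
            return lam_new, v
        lam = lam_new
    return lam, v
```

For example, for the symmetric matrix [[2, 1], [1, 2]], with eigenvalues 3 and 1, the estimate approaches 3 and the vector lines up with (1, 1)/sqrt(2); the error in direction shrinks by a factor of about |lambda_1|/|lambda_0| = 1/3 per step.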
