In this segment, we are going to talk about
the definition of eigenvalues and eigenvectors.
So if [A] is a square matrix, an n by n matrix,
then we say that a vector [X], which is not
equal to zero, is an eigenvector of [A] if
[A][X] = "lambda"[X]. So if you have a square
matrix and you find out that, hey, there is
a column vector which is not the zero vector,
and it satisfies this particular condition,
[A][X] = "lambda"[X], then [X] is called an
eigenvector of [A] and "lambda" is the corresponding
eigenvalue, and of course "lambda" is a scalar,
so it's just a number. That's how we define
eigenvalues and eigenvectors.
So all you have to do is find a column vector [X]
which is non-zero, such that when you multiply it
by the [A] matrix, the result turns out to be some
number times that same column vector [X]. Whatever
that scalar is, the number by which [X] is multiplied
so that this equality holds, is called an eigenvalue;
so "lambda" is the eigenvalue. You can get different
eigenvalues for an n by n matrix: an n by n matrix
has n eigenvalues, which we may call "lambda"1,
"lambda"2, all the way up to "lambda"n. They are not
necessarily distinct, but counted with repetition
there are n of them, and corresponding to each
eigenvalue you will have an eigenvector. Those are
the things we have to think about when we talk about
what it means for a particular square matrix
to have eigenvalues and eigenvectors.
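As a rough sketch of how this plays out for an n by n matrix, the example below uses NumPy's numpy.linalg.eig, which is not part of the lecture; the 3 by 3 matrix is an assumed illustrative choice whose repeated diagonal entry shows that the n eigenvalues need not be distinct, while each one still comes with an eigenvector satisfying the defining condition.

```python
import numpy as np

# Illustrative 3x3 matrix (not from the lecture); the repeated diagonal
# entry 2 shows that the n eigenvalues need not be distinct
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# eig returns n eigenvalues (here 2, 2, 5) and one eigenvector per column
eigenvalues, eigenvectors = np.linalg.eig(A)

for k in range(A.shape[0]):
    lam = eigenvalues[k]
    x = eigenvectors[:, k]
    # Each pair should satisfy the defining condition [A][X] = "lambda"[X]
    print(lam, np.allclose(A @ x, lam * x))
```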
And that's the end of this segment.
