Hmm.  Let's look at the determinant a
little bit more. We look at the determinant of a 2 by 2 matrix, and we think of it as having a column a_0 and a column a_1. These columns are vectors, and then the determinant turns out to be the area of the following parallelogram. You take your vector a_0 and you anchor it at the origin. And you take the
vector a_1 and you do the same thing
there.  And then you look at the
parallelogram that has those vectors as
its sides.  Then the area of the
parallelogram is actually equal to the
absolute value of the determinant.
Hmm, interesting.
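To make this concrete, here is a small sketch using numpy (the columns a_0 and a_1 below are illustrative values, not from the lecture): the area of the parallelogram spanned by the two columns comes out as the absolute value of the determinant.

```python
import numpy as np

# Two illustrative column vectors a_0 and a_1.
a0 = np.array([3.0, 1.0])
a1 = np.array([1.0, 2.0])

# Stack them as the columns of a 2x2 matrix.
A = np.column_stack([a0, a1])

# Area of the parallelogram with sides a_0 and a_1
# equals |det(A)| = |3*2 - 1*1| = 5.
area = abs(np.linalg.det(A))
print(area)  # 5.0 (up to round-off)
```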
Notice that if the determinant is equal to 0, then the parallelogram has no area. And what that means is that a_0 and a_1 lie along the same line -- one is a scalar multiple of the other -- and are therefore linearly dependent. A whole bunch of stuff relates back to things we've seen before. Alright? Now similarly, if you go to a matrix A
that has three columns, then the absolute value of the determinant is the volume of the natural three-dimensional extension of a parallelogram, which is called a parallelepiped. Okay?
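Both facts are easy to check numerically. A hedged sketch (with made-up matrices): dependent columns collapse the parallelogram, so the determinant is 0, and in three dimensions |det| measures the volume of the parallelepiped.

```python
import numpy as np

# Linearly dependent columns: the second column is twice the first,
# so the parallelogram collapses to a line and det(B) = 0.
B = np.column_stack([[2.0, 1.0], [4.0, 2.0]])
print(np.linalg.det(B))  # 0.0 (up to round-off)

# In 3D, |det| is the volume of the parallelepiped
# spanned by the three columns: here 1 * 2 * 3 = 6.
C = np.column_stack([[1.0, 0.0, 0.0],
                     [0.0, 2.0, 0.0],
                     [1.0, 1.0, 3.0]])
print(abs(np.linalg.det(C)))  # 6.0
```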
And if you go to four dimensions, you get
the natural extension to four dimensions,
whatever that looks like. All of that is very nice, and it allows you to come up with a formula that computes the determinant, but none of that is particularly important, as we're going to find out. The only thing that is important is that if A is an m by m matrix, then the determinant of lambda I minus A looks like lambda to the m power, plus some constant times lambda to the m minus first power, and so forth. In other words, it is a polynomial of degree m. Now
this tells us all kinds of interesting
things. In particular, it tells us that a
matrix A has at least one eigenvalue, possibly complex. Why? Because, by the fundamental theorem of algebra, an mth degree polynomial has at least one root. Okay, it also tells us that our matrix A has at most m distinct eigenvalues. Why? Because an mth degree polynomial has at most m distinct roots.
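Here's a small sketch of this correspondence for an illustrative 2 by 2 matrix (not one from the lecture): numpy's `np.poly` returns the coefficients of det(lambda I - A), highest power first, and the roots of that polynomial are exactly the eigenvalues of A.

```python
import numpy as np

# An illustrative m x m matrix with m = 2.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of det(lambda*I - A), highest power first:
# lambda^2 - 4*lambda + 3, a polynomial of degree m = 2.
coeffs = np.poly(A)
print(coeffs)  # [ 1. -4.  3.]

# Its roots are the eigenvalues of A -- at most m distinct ones.
print(np.sort(np.roots(coeffs)))       # [1. 3.]
print(np.sort(np.linalg.eigvals(A)))   # [1. 3.]
```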
Okay, so there are all these parallels between the properties of the eigenvalues of a matrix and the roots of a polynomial. Here's another one. If -- and
this is something we probably should prove, but you're just gonna take my word for it because the details aren't particularly important -- if A has real-valued entries, then its characteristic polynomial has real-valued coefficients, and we know that if a polynomial has real-valued coefficients, then its complex roots have to come in conjugate pairs. And therefore we know that if A is real-valued, then its complex eigenvalues have to come in conjugate pairs. So the
characteristic polynomial is extremely
important because it allows us to borrow
all kinds of results from what we know
about polynomials.
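As a quick illustration of the conjugate-pair fact (the matrix below is my own example, a 90-degree rotation of the plane): its characteristic polynomial is lambda^2 + 1, which has real coefficients, so its eigenvalues are the conjugate pair i and -i.

```python
import numpy as np

# A real-valued matrix: rotation by 90 degrees in the plane.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# det(lambda*I - A) = lambda^2 + 1 has real coefficients,
# so the eigenvalues come as a complex-conjugate pair: +i and -i.
eigs = np.linalg.eigvals(A)
print(eigs)

# One eigenvalue is the conjugate of the other.
print(np.isclose(eigs[0], np.conj(eigs[1])))  # True
```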
But that's about the extent of it.
