In this video we continue our introduction to
Eigenvalue Theory by undertaking case studies
of eigenvalues associated with
a selection of 2 by 2 matrices.
The idea here is that we're going to follow the
process of mathematical inductive reasoning.
Deductive reasoning studies theory first
and then applies general statements to examples.
Inductive reasoning, by contrast, starts by studying a
collection of examples and then finding patterns
to form an idea about the theory, and
then eventually we'll study the proofs.
In this case we're going to study
Eigenvalue Theory for 2 by 2 matrices
by collecting information about various
2 by 2 matrices, and then we're going
to generalize to a more interesting form.
Let's begin with a general overview of how
we're going to conduct these case studies
and what type of information
we're going to be looking for.
For each 2 by 2 matrix that we study,
we're going to analyze a number
of properties about those matrices.
First, we're going to talk about
something called matrix structure.
Specifically for each matrix we're
going to ask ourselves what kind
of structure does that matrix have.
Is it diagonal? Is it lower triangular
or upper triangular? Is it symmetric?
Is it positive definite,
positive semidefinite,
or neither? Is it invertible?
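These structural checks are easy to script. Here is a minimal sketch using NumPy; the `describe_structure` helper and the example matrix are my own illustrative choices, not from the lecture.

```python
import numpy as np

# Hypothetical helper: quick structural checks on a square matrix.
def describe_structure(A, tol=1e-12):
    checks = {
        "diagonal":         np.allclose(A, np.diag(np.diag(A)), atol=tol),
        "lower triangular": np.allclose(A, np.tril(A), atol=tol),
        "upper triangular": np.allclose(A, np.triu(A), atol=tol),
        "symmetric":        np.allclose(A, A.T, atol=tol),
        "invertible":       abs(np.linalg.det(A)) > tol,
    }
    if checks["symmetric"]:
        # Definiteness is usually discussed for symmetric matrices;
        # check it via the signs of the (real) eigenvalues.
        vals = np.linalg.eigvalsh(A)
        checks["positive definite"]     = bool(np.all(vals > tol))
        checks["positive semidefinite"] = bool(np.all(vals >= -tol))
    return checks

print(describe_structure(np.array([[2.0, 1.0], [1.0, 2.0]])))
```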
Next, we're going to ask ourselves questions
about the algebraic properties of the eigenvalue
and eigenvector information
for each matrix that we study.
We start with the question: what are the
eigenvalues and eigenvectors of the matrix
that we're looking at? Which means
we actually have to calculate them.
Next, we want to ask how
many eigenvalues there are.
For each eigenvalue, how many
eigenvectors can we find,
and what type of scalar
is each eigenvalue?
Are they real, or are they complex?
(They can be complex even for real matrices.)
Are they positive or negative?
Are they zero or nonzero?
Are any of the eigenvalues
repeated in our analysis?
We'll go much further into this but remember,
we're just getting a general overview.
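To make those algebraic questions concrete, here is a minimal sketch of how you might compute eigenvalue and eigenvector information numerically with NumPy; the sample matrix is a hypothetical choice for illustration.

```python
import numpy as np

# A sample 2-by-2 matrix, chosen for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

print("eigenvalues:", eigenvalues)      # how many, real or complex, repeated?
print("eigenvectors:\n", eigenvectors)  # one column per eigenvalue

# Verify the defining property A v = lambda v for each pair.
for k in range(2):
    v = eigenvectors[:, k]
    assert np.allclose(A @ v, eigenvalues[k] * v)
```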
So for every matrix that we study in our case
study, we're going to study matrix structure,
algebraic properties of eigenvalues,
and also geometric properties
of eigenvalue and eigenvectors.
We're not going to just stay
with that algebraic analysis.
We're going to actually see if we can develop
a geometric interpretation and intuition
about what the eigenvalue and eigenvector
information is telling us associated
with the matrix that we're studying.
Specifically, we ask: what type of
geometric intuition can we build based
on the eigenvalue and eigenvector
information that we collect?
When we think about the property
Ax = λx, we might ask ourselves:
for different types of input vectors x, what
does the matrix product Ax represent?
And if indeed we find eigenvectors,
so that Ax = λx,
how can we interpret that geometrically?
Specifically we're going to kind of focus
on a few different types of geometric ideas.
There's a concept of scaling where
the eigenvalue either stretches
or compresses the original input, but
maybe maintains the same orientation.
We also have a concept of
reflection or flipping orientation
which might happen if the
eigenvalue is negative.
We have this concept of rotation.
So we've seen Givens rotations and
we've seen rotational matrices.
One of the questions we're
going to ask ourselves is, well,
what is the eigenvalue behavior
of rotational transformations.
And then also we have this concept of
projections, which is loss of information.
Every time we look at a specific case
study, we're going to figure out which
of these different geometric interpretations
are at play and how they relate
to the matrix structure and algebraic properties.
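To preview these geometric behaviors, here is a sketch with one hypothetical example matrix for each type; the specific matrices are my own illustrative choices, not from the lecture.

```python
import numpy as np

# Hypothetical example matrices for each geometric behavior discussed.
scaling    = np.array([[2.0, 0.0], [0.0, 0.5]])   # stretch x, compress y
reflection = np.array([[1.0, 0.0], [0.0, -1.0]])  # flip across the x-axis
rotation   = np.array([[0.0, -1.0], [1.0, 0.0]])  # rotate 90 degrees
projection = np.array([[1.0, 0.0], [0.0, 0.0]])   # project onto the x-axis

for name, M in [("scaling", scaling), ("reflection", reflection),
                ("rotation", rotation), ("projection", projection)]:
    vals, _ = np.linalg.eig(M)
    print(f"{name:10s} eigenvalues: {vals}")
```

Notice how each behavior shows up in the eigenvalues themselves: reflection produces a negative eigenvalue, projection produces a zero eigenvalue (the loss of information), and rotation produces complex eigenvalues.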
I don't want you to worry
about all this right now.
We're going to answer all those
questions for each matrix that we study,
but the concept is that we're going to
study a large collection of 2 by 2
and then 3 by 3 matrices.
For each matrix we're going to
analyze its structure to see whether
there's anything special
going on with the matrix.
We're going to calculate eigenvalue and
eigenvector information and then figure
out what the algebraic properties are.
Then we're going to interpret
that information geometrically.
After we've done all three types of
analysis, we're going to relate them together
and see if we can notice patterns.
Later, those patterns will become
the theorems, propositions,
and definitions of Eigenvalue Theory.
Moreover, as we make connections between
the matrix structure, algebraic properties
of Eigen information and geometric
properties of Eigen information,
there are going to be some very, very
famous and powerful theorems that allow us
to describe the inter-connectivity between
these for special types of matrices.
Those theorems we will prove
later on in our introduction.
But for now, I want to prime the pump.
I want to discuss some of the patterns
that are going to arise in the 2 by 2 case
and we'll highlight those in each
individual example that we study.
The good news is that for 2 by 2 matrices, there
are only four possible categories of eigenvalue
and eigenvector information related to
both algebraic and geometric properties.
The first category is a 2 by 2 matrix A
that has two distinct real eigenvalues, meaning
two eigenvalues that are not equal to each other,
along with two linearly independent eigenvectors.
So each unique eigenvalue has an
associated linearly independent eigenvector.
Algebraically, we write that A
has two eigenvalues, λ₁ and λ₂,
and two linearly independent eigenvectors
such that A v₁ = λ₁ v₁ and A v₂ = λ₂ v₂.
The second category of eigen information that
might show up is a real matrix A
that has one repeated eigenvalue but
two linearly independent eigenvectors.
That would look like A producing a
single eigenvalue λ that is repeated twice.
But when we find the eigenvectors, there are
actually two linearly independent vectors
v₁ and v₂ that satisfy
A vₖ = λ vₖ for k = 1, 2.
The third type of 2 by 2 matrix that's going
to arise in your case studies is a matrix A
that only has one real eigenvalue
that's repeated
and a single linearly independent eigenvector.
These matrices are called incomplete,
or defective; we'll talk about that later.
But the idea is when we're studying the
matrix A, we can produce a single eigenvalue
that only has one eigenvector
for that eigenvalue.
The last category of real
2 by 2 matrix that arises
when we're analyzing eigenvalues is a
real matrix that has no real eigenvalues,
but instead two complex conjugate eigenvalues
and two linearly independent
complex eigenvectors.
Specifically, we're going to see that for every
complex eigenvalue, we have a theorem later
that says the conjugate of that eigenvalue must
also be an eigenvalue of our original matrix.
Similarly, the corresponding eigenvectors
are complex conjugates of each other.
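The four categories can be previewed numerically. Below is a minimal sketch with one hypothetical example matrix per category; `np.linalg.eig` reports the eigenvalues, and the rank of the eigenvector matrix counts the linearly independent eigenvectors it found.

```python
import numpy as np

# One hypothetical example matrix per category (illustrative choices).
cases = {
    "1: two distinct real eigenvalues":       np.array([[3.0, 0.0], [0.0, 1.0]]),
    "2: repeated eigenvalue, 2 eigenvectors": np.array([[2.0, 0.0], [0.0, 2.0]]),
    "3: repeated eigenvalue, 1 eigenvector":  np.array([[2.0, 1.0], [0.0, 2.0]]),
    "4: complex conjugate pair":              np.array([[0.0, -1.0], [1.0, 0.0]]),
}

for label, A in cases.items():
    vals, vecs = np.linalg.eig(A)
    # The numerical rank of the eigenvector matrix counts the
    # linearly independent eigenvectors that were found.
    rank = np.linalg.matrix_rank(vecs)
    print(f"{label}: eigenvalues {vals}, independent eigenvectors {rank}")
```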
Please don't stress out about that too much.
I promise you we're going to go into glorious
detail for each example that we look at
and then take a look at the entire
structure as we build this theory.
The point of this particular
video is to set the stage
and help focus your mind
as we do our case studies.
For every single example that we look
at we're going to do three things.
First we're going to analyze matrix
structures, algebraic properties of eigenvalues
and eigenvectors, and then a geometric
interpretation of that information.
Next, we're going to categorize
the algebraic and geometric information,
and when we do that, we're going to see
that there are only four different
categories for 2 by 2 matrices.
One category is when we have
two distinct real eigenvalues
and two linearly independent eigenvectors.
The second category is when we
have one repeated eigenvalue
but two linearly independent eigenvectors.
The third category is when we have a single
repeated real eigenvalue
and only one linearly
independent eigenvector.
And then the only other option that
we have is two complex eigenvalues
and two complex eigenvectors.
Those complex values will actually
be conjugates of each other.
The last thing we're going to do is relate
our analysis and our categories together.
When we figure out the matrix structure
and the associated algebraic
and geometric properties, and
then start to categorize them,
we're going to ask ourselves the question:
are there patterns between the matrix structures
that arise and the corresponding algebraic
and geometric properties that we notice?
The answer to that question is yes.
If we were only a few hundred years older,
we would be famous for the rest of time
because that's exactly what the
theory is going to tell us about.
With that in mind, for every matrix that we
study, please focus on analyzing, categorizing,
and relating that type of
information so that we can get a sense
of how eigenvalues work in the 2 by 2 case.
We will start that process in the next video.
