GILBERT STRANG: OK.
We're coming to the point
where we need matrices.
That's the point when we
have several equations,
several differential
equations instead of just one.
And it's a matrix that
does that coupling.
So can I-- this won't be a
full course in linear algebra.
That's available, you may know,
on OpenCourseWare as 18.06.
That's the linear
algebra course.
But we just need a few facts,
and why not just
say them here in a few minutes?
So I have a matrix.
Well there's a matrix.
That's a 3 by 3 matrix.
And first I want to ask how
does it multiply a vector.
So there it is multiplying
a vector, v1, v2, v3.
And what's the result? Key idea:
the answer
on the right-hand side
is this number v1 times that
column, plus this number
v2 times the second
column, plus the third number
v3 times
the third column,
a combination of the columns of A.
That's what a times v is.
That's what the notation of
matrix multiplication produces.
That's really basic to see it
as a combination of columns.
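This column picture isn't hard to check numerically. Here is a small numpy sketch (not from the lecture) using the 3 by 3 matrix from the board, with columns 1, 1, 1; 1, 2, 2; 2, 3, 3, and a sample v chosen just for illustration:

```python
import numpy as np

# The lecture's example matrix, written by columns:
# column 1 = (1,1,1), column 2 = (1,2,2), column 3 = (2,3,3)
A = np.array([[1, 1, 2],
              [1, 2, 3],
              [1, 2, 3]])
v = np.array([4, 5, 6])  # any sample v1, v2, v3

# A times v is v1*(column 1) + v2*(column 2) + v3*(column 3)
combo = v[0] * A[:, 0] + v[1] * A[:, 1] + v[2] * A[:, 2]
print(A @ v)                        # [21 32 32]
print(np.array_equal(A @ v, combo)) # True: same answer both ways
```

The matrix-vector product and the column combination agree entry by entry; that is exactly what the notation means.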
Now I want to build on that.
That's one particular case: if
you give me v1, v2, and v3,
I know how to multiply it.
I take the combination.
Now I would like you to
think about the result
from all v1, v2, and v3.
If I take all those numbers,
I get a whole lot of answers.
They're all vectors;
the result of A times v
is another vector,
Av. And I want
to think about Av, those
outputs, for all inputs v.
So I take v1, v2, v3 to
be all possible numbers.
And I get all combinations
of those three columns.
And usually I would get the
whole 3-dimensional space.
Usually I can
produce any vector,
any output b1, b2, b3
from A times v. But not
for this matrix,
not for this matrix.
Because this matrix is,
you could say, deficient.
That third column
there, 2, 3, 3,
is obviously the sum of
column one and column two.
So this v3 times
that third column
just produces something
that I could already get
from column one and column two.
That v3 times column
three, I could cross out.
It's the same as column
one plus column two,
for this matrix, not usually.
So I only really have
a combination of two columns.
It looks like a combination of three.
But the third one was
dependent on the others.
And it's really a
combination of two columns.
So combinations of two
columns, two vectors
in 3-dimensional
space produce a plane.
I only get a plane.
I don't get all of 3-dimensional
space, only a plane.
And I call that plane
the column space,
so the column space
of the matrix.
So if you gave me
a different matrix,
if you change this 3 to an
11, probably the column space
now changes to--
for that matrix I
think the column space would be
the whole 3-dimensional space.
I get everything.
But when this third column is
the sum of the first two
columns, it's not
giving me anything new.
And the column space
is only a plane.
And you can think of a matrix
where the column space is only
a line, just one
independent column.
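The dimension of the column space is the rank, and numpy can compute it. A minimal sketch (my own check, not the lecture's), assuming the "change this 3 to an 11" refers to the bottom entry of the third column:

```python
import numpy as np

A = np.array([[1, 1, 2],
              [1, 2, 3],
              [1, 2, 3]])   # third column = first + second

print(np.linalg.matrix_rank(A))  # 2: the column space is only a plane

# Change one entry of the third column (here, the bottom 3 becomes 11);
# now the three columns are independent
B = A.copy()
B[2, 2] = 11
print(np.linalg.matrix_rank(B))  # 3: column space is all of 3D space
```

Rank 2 says exactly what the lecture says: only two independent columns, so the combinations fill out a plane, not the whole space.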
OK.
So that's the column space:
the column space is all
combinations of the columns.
In other words, it's
all the results,
all the outputs from A times
v. Those are the
combinations of the columns.
So we can answer the most basic
question of linear algebra.
When does Av equal b
have a solution?
When is there a v so
that I can solve this?
When is there a v that
solves this equation?
So it's a question about b.
What is it about b that must
be true if this can be solved?
Well, that
equation is saying
b is a combination
of the columns of A.
So this has a
solution when b must
be-- shall I say must
be in the column space.
For that example,
the only b's where
we can get a solution are
b's that are combinations
of the first two columns.
Because having the third
column at our disposal
gives us no help.
It doesn't give us anything new.
It would be solvable
if b equaled 1, 1, 1.
That's a combination of the
columns, or if b equals 1, 2, 2.
That's another simple
combination of the columns.
Or if b equals 2, 3, 3.
But I'm only, I'm
staying on a plane there.
And most b's are off that plane.
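One way to test "is b in the column space" numerically is to ask whether appending b as an extra column raises the rank. This is my own sketch, not a method from the lecture; the function name `solvable` is made up for illustration:

```python
import numpy as np

A = np.array([[1, 1, 2],
              [1, 2, 3],
              [1, 2, 3]])

def solvable(A, b):
    # Av = b has a solution exactly when b is in the column space,
    # i.e. tacking b on as a column does not increase the rank
    return np.linalg.matrix_rank(np.column_stack([A, b])) == \
           np.linalg.matrix_rank(A)

print(solvable(A, np.array([1, 1, 1])))  # True: the first column itself
print(solvable(A, np.array([2, 3, 3])))  # True: the third column
print(solvable(A, np.array([0, 0, 1])))  # False: off the plane
```

The first two b's lie on the plane of combinations; the last one does not, so Av = b has no solution there.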
Now we know when there is a solution.
All right.
Now a second key idea
of linear algebra,
can we do it in
this short video?
I want to know about the
equation Av equals 0.
So now I'm setting the
right-hand side to be 0.
That's the 0 vector, 0, 0, 0.
Does it have a solution?
Does it have a solution?
Let's take this example.
1, 1, 1; 1, 2, 2; 2, 3, 3; now
I'm looking at the solutions
when the right side is all 0.
Does that have a solution?
Is there a combination of those
three columns that gives 0?
Well there is always
one combination.
I could take 0, 0, and 0.
I could take nothing,
0 of everything.
0 of this column, 0 of that
column, 0 of the third column,
would give me
the 0 vector.
That solution is
always available.
The big question is, is
there another solution.
And here for this deficient,
singular, non-invertible
matrix, there is.
There is another solution.
Let me just write it down.
Let me put it in there.
Do you see what the solution is?
The third column is
the sum of those two.
So if I take one of that third
column, I should take minus 1
of each of the other columns.
So this is minus this
column, minus this column,
plus this column
gives me the 0 column.
That is a vector
in the null space.
That's a solution to
Av equals 0.
So the null space is all
solutions to Av equals 0.
It's all the v's.
The null space is
a bunch of v's.
The column space
was a bunch of b's.
I just want to
emphasize that difference.
I was looking at
which b's allowed a solution.
I wasn't paying attention to
what that solution was, just
is there a solution.
Then that b is in
the column space.
I take b equals 0.
I fixed that all important b.
And now I'm looking
at the solutions.
And here I find one.
Can you find any more solutions?
I think minus 10, minus 10, and
10 would be another solution.
It's 10 times as much.
And 0, 0, 0 is a solution.
So there's a whole line of solutions.
We had a plane for
the column space.
But we have a line
for the null space.
Isn't that neat?
One's a plane, one's
a line, dimension two
plus dimension one.
Two for the plane,
one for the line,
adds to dimension three, the
dimension of the whole space.
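Both claims, the null space line and the dimension count, can be verified with a short numpy sketch (mine, not the lecture's; the name `vn` for the null vector follows the lecture's "v null"):

```python
import numpy as np

A = np.array([[1, 1, 2],
              [1, 2, 3],
              [1, 2, 3]])

vn = np.array([-1, -1, 1])   # the null space vector from the board
print(A @ vn)                # [0 0 0]
print(A @ (10 * vn))         # [0 0 0] too: any multiple works, a whole line

# Dimension count: dim(column space) + dim(null space) = 3
rank = np.linalg.matrix_rank(A)
print(rank, 3 - rank)        # 2 and 1: a plane plus a line
```

Two plus one is three: the plane of outputs and the line of null solutions together account for the dimension of the whole space.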
OK.
That's a little dimension count coming in.
All right.
Now I ask, what
are all solutions?
Complete solution
to Av equals, well
let me choose some right-hand
side where there is a solution.
Let me choose a
right-hand side, say
if I add that column
and that column,
I'll get Av-- maybe I'll
take two of that column
plus one of that column.
Two of the first column
plus one of the second
would be 3, 2 plus
2 would be a 4,
and 2 plus 2 would be another 4.
OK.
That's my b.
It's a combination
of the columns.
You saw me create it from
the first two columns.
So now I ask, what
are all the solutions?
It's in the column space.
It's 2 times the first column,
plus the second column.
But there may be
other solutions.
So all solutions, the complete
solution, v complete--
here's the key idea.
And the point is that
it's the same idea that we know
from differential equations.
It's particular solution
plus any null solution.
Plus all, you can
say all v null.
Particular plus null solution.
It's such an important concept
we just want to see it again.
One particular
solution for that
b would be--
v particular
could be-- how
did we produce that b?
Out of two of these, plus one
of these, plus zero of that.
So v particular
could be 2, 1, 0.
It works for that particular
b, two of the first column,
one of the second.
Now then we could add in
anything in the null solution.
So we have infinitely
many solutions here.
We've got one solution
plus added to that,
a whole line of solutions.
This, all the null space,
would be all vectors like that.
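Here is the complete-solution picture in a numpy sketch (my own check, not from the lecture), with the particular solution vp = (2, 1, 0) and the null direction vn = (-1, -1, 1) from the board:

```python
import numpy as np

A = np.array([[1, 1, 2],
              [1, 2, 3],
              [1, 2, 3]])
b = np.array([3, 4, 4])        # two of column one plus one of column two

vp = np.array([2, 1, 0])       # particular solution: A @ vp == b
vn = np.array([-1, -1, 1])     # null space direction: A @ vn == 0

# vp plus any multiple of vn also solves Av = b: a whole line of solutions
for c in (0, 1, -5, 100):
    print(A @ (vp + c * vn))   # [3 4 4] every time
```

Adding any multiple of the null solution changes nothing on the right-hand side, which is exactly why the complete solution is particular plus null.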
OK.
That's the picture
that we've seen
for differential equations.
And I just want to bring it
out again for matrix equations,
using the language
of linear algebra.
That's what I'm
introducing here.
I have one particular
solution, plus anything
in the null space.
That space of vectors
is the heart of linear algebra.
Thank you.
