This is the first of three videos on
solving systems of linear differential
equations, that is, the time derivative
of the vector x is some matrix
times the vector x.
In this video, we're gonna talk about
what happens when the eigenvalues
are real. Subsequent videos will 
talk about complex eigenvalues
and about non-diagonalizable matrices.
So, here's the kind of system that
we're talking about.
We're looking at differential equations 
and this is a 2x2 system.
Let's keep it simple.
Derivative of the first variable is
3 times the second variable.
Derivative of the second variable is
twice the first, plus the second.
And we'll take initial conditions
x_1(0) = 6 and x_2(0) = 1.
Now you can rewrite this in matrices,
say the derivative of the vector x_1 x_2
is this matrix times the vector.
And our initial condition
is a vector 6 1.
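If you want to play with this system numerically, here's a minimal NumPy sketch (my own illustration, not from the lecture) that writes the system in matrix form and checks it against the componentwise equations at t = 0:

```python
import numpy as np

# The 2x2 system from the lecture, x-dot = A x:
#   x1' = 3*x2
#   x2' = 2*x1 + x2
A = np.array([[0.0, 3.0],
              [2.0, 1.0]])
x0 = np.array([6.0, 1.0])  # initial condition: x1(0) = 6, x2(0) = 1

# Sanity check: at t = 0 the matrix form gives the same derivatives
# as the componentwise equations: x1' = 3*1 = 3, x2' = 2*6 + 1 = 13.
print(A @ x0)
```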
Now, this may look a little bit familiar,
it should because it behaves just like
a system of difference equations.
We've already studied systems of
difference equations, where
x_1 today is 3 times x_2 yesterday,
and x_2 today is 2 times x_1 yesterday
plus x_2 yesterday.
In those cases, with the same initial
conditions, we had that x at any given
time was a matrix times 
x the previous time.
And we had the exact same
initial conditions.
So what we're gonna do is we're gonna
solve both systems,
the differential equations and the
difference equations, in parallel.
I'm always gonna use red ink for the
differential equation and green ink
for the difference equation.
And the method is absolutely the same.
The first step is that we want to 
diagonalize the matrix.
We need to find our eigenvalues
and our eigenvectors.
And then we let y be the coordinates
of x in the basis of eigenvectors.
That's always gonna be the right
basis to use, then we rewrite our
equation in terms of y. 
We solve those equations.
And then we convert the initial value
of x into an initial value of y,
solve to get the final value of y,
and convert that back into
the final value of x.
That's the procedure that works 
really well for a difference equation.
It'll work just as well for 
a differential equation.
So, step 1 doesn't care about whether
it's difference or differential.
It's diagonalizing the matrix.
Each row sums to 3, so 3 is an
eigenvalue. The trace is 1, so the
other eigenvalue is 1 - 3 = -2.
And the eigenvectors are 1 1,
and 3 -2. So far so good.
So, our basis is 1 1 and 3 -2.
And one of our change-of-basis matrices
is P, with columns 1 1 and 3 -2, and
the other is the inverse of that matrix.
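As a quick numerical check of this diagonalization (my own sketch; note that np.linalg.eig normalizes its eigenvectors and orders eigenvalues arbitrarily, so it's easiest to verify A = P D P^{-1} directly with the lecture's P):

```python
import numpy as np

A = np.array([[0.0, 3.0],
              [2.0, 1.0]])

# The eigenvalues should be 3 and -2 (in some order).
vals, _ = np.linalg.eig(A)
print(sorted(vals.real))  # [-2.0, 3.0]

# P has the eigenvectors (1,1) and (3,-2) as columns; D holds the eigenvalues.
P = np.array([[1.0, 3.0],
              [1.0, -2.0]])
D = np.diag([3.0, -2.0])

# Diagonalization check: A = P D P^{-1}.
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```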
The next step is to write y,
which is x in the B basis.
So we go from x to y by multiplying
by P_BE. That's this matrix.
And what that means is that at any given
time, x can be recovered by multiplying
by the inverse matrix, and that y_1 and
y_2 are the coefficients when you expand
x in the B basis.
So x is y_1 times b_1 plus y_2 times b_2.
That's what that means.
Okay.
Now let's rewrite our equations 
in terms of y.
Since x can be expanded as
y_1 b_1 plus y_2 b_2, we can take
the derivative of x.
I'm using the dot notation, where a dot
means the time derivative.
b_1 and b_2 are just fixed vectors, so
the time derivative of x is
y_1-dot times b_1 plus y_2-dot times b_2.
Meanwhile Ax is A times y_1b_1 +
A times y_2b_2.
And y_1 is just a constant that
pulls out
and y_2 is constant that pulls out
and b_1 and b_2 are Eigenvectors.
So Ab_1 is just 3b_1 
and Ab_2 is -2b_2.
And since our equations are
that xdot is equal to Ax,
that means that y_1dot has 
to be 3y_1
and that y2dot has to be
equal to -2y_2.
In general, if you do this for some
other matrix, you're always going to
get that y_j-dot is the jth
eigenvalue times y_j.
Now when we did this for 
difference equations
it was the same sort of argument.
We said x today is y_1 today 
times b_1 plus y_2 today times b_2.
And x yesterday was the same thing
only we had y_1 yesterday
and y_2 yesterday.
The b's don't change,
and A times x yesterday
was given by y_1 yesterday
times Ab_1
plus y_2 yesterday times Ab_2.
And that was 3y_1 b_1 minus 2y_2 b_2.
And then we set this equal to this,
these two are equal.
So we said 'oh, y_1 today
must be 3 times y_1 yesterday,
and y_2 today must be -2 times
y_2 yesterday.'
And in general, if you had a bigger
matrix with many eigenvalues,
the jth y today is the jth eigenvalue
times the jth y yesterday.
So how do you solve these equations?
Here's where things are different.
The solution to y dot equals 3y
is e^3t times the starting value.
The solution to y dot equals
-2y is e^-2t times the starting value.
Over here the solutions were
3^n times the starting value
and (-2)^n times the starting value.
So the only real difference is that
here you exponentiate the
eigenvalue times time, while here you
take the nth power of the eigenvalue:
e^{3t} versus 3^n,
e^{-2t} versus (-2)^n.
And then we go around our square.
Here's how we go around a square
for the differential equations.
We start off with initial values, 6, 1
we multiply by p inverse 
and we get 3,1.
Then we multiply the first coefficient
by e^{3t} and the second coefficient
by e^{-2t}, that is, we multiply
by e^{Dt}, and then we multiply
by the matrix P.
So we take y_1
times the first eigenvector
plus y_2 times the
second eigenvector,
we multiply it all out,
and we get the solution.
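This trip around the square can be sketched in a few lines of NumPy (my own illustration of the procedure, not code from the lecture):

```python
import numpy as np

# "Around the square": x(t) = P e^{Dt} P^{-1} x(0).
P = np.array([[1.0, 3.0],
              [1.0, -2.0]])
x0 = np.array([6.0, 1.0])
eigs = np.array([3.0, -2.0])

y0 = np.linalg.solve(P, x0)  # multiply by P^{-1}: initial value in the B basis
print(y0)                    # [3. 1.]

def x(t):
    yt = np.exp(eigs * t) * y0  # e^{Dt} acts componentwise: (3e^{3t}, e^{-2t})
    return P @ yt               # multiply by P: back to standard coordinates

print(x(0.0))                # recovers the initial condition [6. 1.]
```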
Now if we'd done it for the difference
equation, we would have just said
6,1 goes to 3,1
by p inverse.
Exactly the same as for the
differential equation.
The only difference is that
at this stage, you're not multiplying
by e^{Dt}, you're multiplying
by D^n, so you get 3 times 3^n
and 1 times (-2)^n.
And then over here you multiply
by P to get the solution.
This is exactly the same as the solution
to the differential equation, only with
3^n instead of e^{3t} and (-2)^n
instead of e^{-2t}.
There's the differential solution;
there's the difference solution.
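The difference-equation trip around the square only swaps exponentials for powers; here's a sketch (again my own, for checking) that compares the closed form against simply iterating the matrix:

```python
import numpy as np

A = np.array([[0.0, 3.0],
              [2.0, 1.0]])
P = np.array([[1.0, 3.0],
              [1.0, -2.0]])
x0 = np.array([6.0, 1.0])
eigs = np.array([3.0, -2.0])

y0 = np.linalg.solve(P, x0)  # [3, 1], same first step as before

def x(n):
    yn = eigs ** n * y0      # D^n acts componentwise: (3*3^n, (-2)^n)
    return P @ yn            # back to standard coordinates

# The closed form should agree with iterating the matrix: x(n) = A^n x(0).
print(np.allclose(x(5), np.linalg.matrix_power(A, 5) @ x0))  # True
```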
Okay.
And if you want to write these
solutions in closed form,
they look very much the same.
For the differential equation, the
solution is e^{At} times the initial
value. And how do you get e^{At}?
Well, it's P e^{Dt} P^{-1} times
the initial value.
P^{-1} times the initial
value of x is the initial value of y,
e^{Dt} times the initial
value of y is the final value of y,
and P times that is the final value of x.
For the difference equation
we had A^n instead of e^{At},
and we got that as P D^n P^{-1}.
The same P^{-1},
the same y(0),
D^n instead of e^{Dt},
a slightly different y(n),
and then we multiply by P to come home.
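The closed form e^{At} = P e^{Dt} P^{-1} is easy to test numerically; this sketch (mine, not the lecture's) checks the two properties that define the matrix exponential, e^{A·0} = I and d/dt e^{At} = A e^{At}, the latter by a small central difference:

```python
import numpy as np

# The matrix exponential via diagonalization: e^{At} = P e^{Dt} P^{-1}.
A = np.array([[0.0, 3.0],
              [2.0, 1.0]])
P = np.array([[1.0, 3.0],
              [1.0, -2.0]])
Pinv = np.linalg.inv(P)
eigs = np.array([3.0, -2.0])

def expAt(t):
    return P @ np.diag(np.exp(eigs * t)) @ Pinv

# e^{A*0} should be the identity...
print(np.allclose(expAt(0.0), np.eye(2)))  # True

# ...and e^{At} should satisfy d/dt e^{At} = A e^{At}
# (checked with a small central difference).
t, h = 0.7, 1e-6
deriv = (expAt(t + h) - expAt(t - h)) / (2 * h)
print(np.allclose(deriv, A @ expAt(t), atol=1e-4))  # True
```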
Finally the general solution to the
differential equation.
If I didn't give you initial
conditions, but instead just asked,
what's the general solution?
You would have said it's a sum of terms,
where each term is e^{λt} times an
eigenvector. And you have arbitrary
coefficients, and these coefficients
are really
y_1(0),
y_2(0), ...,
y_m(0).
So this is the most general solution
to the differential equation.
The most general solution to
the difference equation is
the exact same thing except instead
of e^λt you have λ^n.
That's it!
