In this video, we're going to see how to
diagonalize an operator,
the second derivative operator,
on functions on an interval.
Now, whenever you deal with a derivative
operator, you need boundary conditions
so we're going to apply Dirichlet
boundary conditions,
which is to say that we're interested
in functions that vanish at
both ends of our interval. So our
vector space is square integrable
functions on the interval from 0 to L,
we want f(0) to always be zero,
we want f(L) to always be zero.
So, our operator is d^2/dx^2 and we
want to find its eigenvalues
and its eigenvectors.
The first thing to notice is that the
first derivative operator is
anti-Hermitian, the adjoint of the
first derivative is minus
the first derivative, and the way to
see that is we just see what happens
when we take the inner product
of f with the derivative of g.
By definition, that's the integral of
f bar times g' dx, and we can
do that integral by parts.
Take f bar as our u and g' dx as our dv,
so that v is g and du is f bar prime dx,
and you get uv evaluated at the endpoints
minus the integral of v du.
Now, because of our boundary conditions,
the boundary term is zero: g(0) is zero,
g(L) is zero, f(0) is zero, f(L) is zero.
So we don't need that term.
And the remaining term is exactly minus the
inner product of the derivative of f with g.
So pairing minus the derivative of f against g
gives the same answer as pairing f against
the derivative of g, so the adjoint of the
derivative is the minus derivative.
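A quick numerical sanity check of this identity, using arbitrary illustrative choices of f, g, and L (any smooth functions vanishing at both endpoints would do):

```python
import numpy as np

# Check <f, g'> = <-f', g> on [0, L] for functions that vanish at
# both endpoints (Dirichlet boundary conditions).
# f, g, and L below are arbitrary illustrative choices.
L = 2.0
x = np.linspace(0.0, L, 200_001)

f = np.sin(np.pi * x / L) * np.exp(x)  # f(0) = f(L) = 0
g = x * (L - x) ** 2                   # g(0) = g(L) = 0

def trapz(y, x):
    # simple trapezoid rule, to stay independent of NumPy version
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0

df = np.gradient(f, x)
dg = np.gradient(g, x)

lhs = trapz(f * dg, x)   # <f, (d/dx) g>  (everything real, so f bar = f)
rhs = -trapz(df * g, x)  # <-(d/dx) f, g>
print(lhs, rhs)          # agree up to discretization error
```

The boundary term drops out because f and g both vanish at 0 and L, which is exactly why the two numbers match.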
Now, that tells us what the adjoint
of our operator A is,
A is a second derivative. In other
words, the derivative of the derivative.
And the adjoint of the product of
two operators is the product
of the adjoints in the opposite order.
So instead of d/dx composed with d/dx,
you get the adjoint of d/dx composed
with the adjoint of d/dx, that is,
minus d/dx times minus d/dx,
which is d^2/dx^2, which is A again.
So the operator that we're studying,
the second derivative operator,
is Hermitian. So what does
that buy us?
Well, just like on finite dimensional
spaces, Hermitian operators
have real eigenvalues, their eigenvectors
are orthogonal, and they are always
diagonalizable. Now I have to be
a little bit careful about
what diagonalizable means, it has
a spectral decomposition
which is the infinite dimensional analog
of diagonalizable.
For our purposes, let's just say
it's diagonalizable.
And so we need to find all the
eigenvalues and the eigenvectors
and we only need to check real eigenvalues
because it's Hermitian.
So we want to solve the equation Af=λf
with the boundary conditions that we set.
Now, since we only have to consider real
eigenvalues, we'll treat three cases:
positive eigenvalues, negative eigenvalues,
and zero.
So first, let's look for some positive
eigenvalues.
So we want Af, that's the second
derivative of f, to be a positive
multiple of f, say λ with λ > 0. Well, we
know the solutions to that equation:
they're combinations of cosh and sinh,
f(x) = c_1 cosh(root λ x) + c_2 sinh(root λ x),
involving the square root of the eigenvalue.
We studied this equation before using
t's instead of x's, but it's
the same equation.
And then we plug in our boundary
conditions. Since f(0) is zero,
c_1 times the cosh(0) plus c_2
times the sinh(0) is 0,
cosh(0) is one, sinh(0) is zero,
so c_1 is zero, so this term is gone.
And then we plug in our boundary
condition at the other end.
We get that c_2 times the sinh of root λ L
is zero. But root λ L is positive, and sinh
is nonzero away from zero, so sinh of root λ L
is not zero, and c_2 must be zero.
So this term is gone too. So the only
solution to second derivative equals λf
with our boundary conditions is f = 0,
and that doesn't count as an eigenvector.
So there are no positive solutions
to the system of equations,
so there are no positive eigenvalues.
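The same bookkeeping can be done symbolically: impose both Dirichlet conditions on the general cosh/sinh solution and confirm that the resulting linear system for c_1, c_2 has only the trivial solution. A minimal sketch using SymPy:

```python
import sympy as sp

# Positive-eigenvalue case: general solution of f'' = lam*f with lam > 0
# is c1*cosh(sqrt(lam)*x) + c2*sinh(sqrt(lam)*x).
x, lam, L = sp.symbols('x lam L', positive=True)
c1, c2 = sp.symbols('c1 c2')

f = c1 * sp.cosh(sp.sqrt(lam) * x) + c2 * sp.sinh(sp.sqrt(lam) * x)

# Impose f(0) = 0 and f(L) = 0. Since sinh(sqrt(lam)*L) != 0 for
# lam, L > 0, the system should force c1 = c2 = 0.
sol = sp.linsolve([f.subs(x, 0), f.subs(x, L)], [c1, c2])
print(sol)
```

`linsolve` treats the expressions as homogeneous equations set to zero; the nonvanishing of sinh(root λ L) is what makes the system invertible.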
Next we check whether zero is an eigenvalue.
We have to solve the equation Af = 0.
Well, if the second derivative is 0, then
the first derivative must be a constant,
and the function must be a constant
plus another constant times x:
f(x) = c_1 + c_2 x. We use the same
reasoning as in the positive case.
The value at 0 tells you
that c_1 is zero,
and then the fact that it's zero at the
other end tells you that c_2 is zero.
So the only solution here is f = 0,
which doesn't work. No nontrivial
solutions, so zero is not an eigenvalue.
So what we're left with is looking
for negative eigenvalues.
So we want to look for eigenvalues
of the form minus omega squared.
And these work. You see, the solutions
to second derivative of f equals minus
omega squared f are cosines and sines:
f(x) = c_1 cos(omega x) + c_2 sin(omega x).
Since f(0) is 0, that tells you that c_1
is 0 as before, so the cosine term is
gone, and our function has to
be c_2 sine of omega x.
But now, when you impose f(L) = 0,
you don't have to set c_2 equal to 0:
you could instead have
sine of omega L equal to 0,
and if sine of omega L is 0, then
omega L has to be a multiple
of pi, which means omega has to be
a multiple of pi over L,
omega = n pi / L for n = 1, 2, 3, and so on.
So those give our eigenvalues. Our eigenvalues
are minus omega squared,
so that's minus n squared pi squared
over L squared, and our eigenvectors
are sine of n pi x over L. We've seen
this before: this is Fourier series.
The eigenvectors of the second derivative
operator are exactly the Fourier modes,
and when we decompose a function in
a Fourier series, we're just decomposing
in a basis of eigenvectors. And how do
we know that it really forms a basis?
Well, it's because it's a Hermitian
operator, Hermitian operators are
diagonalizable.
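One concrete way to see this spectrum is to discretize the second derivative as the standard second-difference matrix with Dirichlet boundary conditions; its eigenvalues closest to zero should approximate minus n squared pi squared over L squared. The grid size and L = 1 below are arbitrary choices for this sketch:

```python
import numpy as np

# Discretize d^2/dx^2 on (0, L) with Dirichlet boundary conditions as
# the second-difference matrix on N interior grid points:
# (f[i-1] - 2 f[i] + f[i+1]) / dx^2.
L = 1.0
N = 500
dx = L / (N + 1)

A = (np.diag(-2.0 * np.ones(N)) +
     np.diag(np.ones(N - 1), 1) +
     np.diag(np.ones(N - 1), -1)) / dx**2

# Sort eigenvalues from closest to zero downward; they should
# approximate -(n*pi/L)^2 for n = 1, 2, 3, ...
evals = np.sort(np.linalg.eigvalsh(A))[::-1]
for n in range(1, 6):
    print(n, evals[n - 1], -(n * np.pi / L) ** 2)
```

The matrix is real symmetric, the discrete analog of Hermitian, so `eigvalsh` applies and the eigenvalues come out real, just as the theory predicts.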
Okay, so let's sum up what we know.
The eigenvalues of our operator
are minus n squared pi squared over
L squared.
The eigenvectors are sine of n pi x
over L. I say vectors, sometimes we
call them eigenfunctions, because
they're functions of x, but they
live in a vector space, the vector space
L2. The eigenvalues are real, and the
eigenvectors are orthogonal, because
our operator is Hermitian. Now, we
checked that these were orthogonal
by hand before, but they have to
be orthogonal, because A is Hermitian.
And the eigenvectors form a basis.
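The orthogonality can be confirmed exactly for the first few modes, for instance with SymPy (checking three modes here is an illustrative choice; the pattern holds for all n, m):

```python
import sympy as sp

# Exact check that sin(n*pi*x/L) and sin(m*pi*x/L) are orthogonal on
# [0, L] when n != m; the diagonal inner products come out to L/2.
x, L = sp.symbols('x L', positive=True)

modes = [sp.sin(n * sp.pi * x / L) for n in (1, 2, 3)]
gram = sp.Matrix([[sp.integrate(u * v, (x, 0, L)) for v in modes]
                  for u in modes])
sp.pprint(gram)   # diagonal matrix with entries L/2
```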
Now, Fourier series is really good for
problems involving A, because then
we're decomposing in a basis of
eigenvectors. Two examples we're
gonna study in future videos, one is
the heat equation. The heat equation says
that the first partial derivative of the
function with respect to t is a constant
times the second partial derivative with
respect to x, and you should think
of that as being just like the
equation dx/dt = Ax, except
now the unknown isn't a vector x, it's a
function, and we're in an infinite-dimensional
space. But it works the same way:
you find the eigenvalues and
eigenvectors, and you write down
the solution. The wave equation
is the same thing, only with the
second derivative with respect to
t, and that behaves just like
the matrix equation where the second
derivative of x with respect to t
is a matrix times x.
If you understand those problems,
which we did in sections 5.2 and 5.3,
then you understand these infinite-
dimensional problems, and that's the
glory of Hermitian operators, the second
derivative operator, and Fourier series.
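As a sketch of how the eigenbasis gets used, here is the Fourier sine series solution of the heat equation with Dirichlet conditions: expand the initial condition in the eigenvectors, then let each mode decay by its eigenvalue. The values of L, k, t, and the initial condition below are arbitrary illustrative choices:

```python
import numpy as np

# Heat equation u_t = k * u_xx with u(0,t) = u(L,t) = 0, solved by
# expanding the initial condition in the eigenbasis sin(n*pi*x/L):
# each mode decays by exp(k * lambda_n * t), lambda_n = -(n*pi/L)^2.
L, k, t = 1.0, 0.1, 0.05
x = np.linspace(0.0, L, 2001)
u0 = x * (L - x)            # initial condition; vanishes at both ends

def trapz(y, x):
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0

bs = []                     # sine coefficients b_n = (2/L) <sin_n, u0>
u_t0 = np.zeros_like(x)     # series re-evaluated at time 0 (sanity check)
u_t = np.zeros_like(x)      # series evaluated at time t
for n in range(1, 60):
    mode = np.sin(n * np.pi * x / L)
    b_n = (2.0 / L) * trapz(mode * u0, x)
    bs.append(b_n)
    lam = -(n * np.pi / L) ** 2
    u_t0 += b_n * mode
    u_t += b_n * np.exp(k * lam * t) * mode

# For u0 = x(L - x), the exact coefficients are 8 L^2 / (n^3 pi^3) for
# odd n and 0 for even n, so b_1 should be close to 8/pi^3 here.
print(bs[0], 8 / np.pi ** 3)
print(np.max(np.abs(u_t0 - u0)))   # truncation error of the series
```

Note the diagonal structure doing the work: in the eigenbasis, the PDE splits into independent scalar equations db_n/dt = k λ_n b_n, exactly as for dx/dt = Ax with a diagonalizable matrix A.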
