Now that we've found the eigenvalues
and eigenvectors of the second derivative
operator for periodic functions, we're
going to do the first derivative, or more
precisely, minus i times the first
derivative, because we worked out
previously that minus i times the
first derivative is Hermitian, so the
eigenvalues are real, the eigenvectors
are orthogonal, and the eigenvectors
give an orthogonal basis for the entire
vector space.
So let's get to it. We want to solve
minus i times the derivative of f
equals lambda times f, where
lambda is some real number.
Well that means the derivative
of f must be i lambda times f,
and that means that f must go
as e to the i lambda x.
If we just had the derivative equal to
lambda times f, we would get
e to the lambda x; since we have
i lambda instead, we get e to the i lambda x.
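As a quick sanity check, here is a minimal numerical sketch (the values of `lam`, `C`, and `x0` are arbitrary test choices, not from the lecture) that applies minus i d/dx to C e^(i lambda x) by a central difference and compares with lambda times the function:

```python
import cmath

lam, C = 3.0, 2.0                            # arbitrary test values
f = lambda x: C * cmath.exp(1j * lam * x)    # candidate eigenfunction

h, x0 = 1e-6, 0.7
fprime = (f(x0 + h) - f(x0 - h)) / (2 * h)   # central-difference f'(x0)

# -i f'(x0) should match lam * f(x0) up to finite-difference error:
print(abs(-1j * fprime - lam * f(x0)))       # tiny
```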
So our function is some arbitrary
constant, e to the i lambda x,
and now we're applying periodic
boundary conditions, which means
that the value at 0 and at L have to
be the same, which means that
if you plug in L, you have to get
e to the i lambda L times c equals c;
in other words, e to the i lambda L
has to be 1. But e to the i lambda L
is cosine lambda L plus
i sine lambda L, and that's 1 precisely
when the cosine is 1 and the sine is 0,
in other words, when lambda L is a
multiple of 2 pi, so lambda L
has to be a multiple of 2 pi, which
means that lambda has to be a
multiple of 2 pi over L. Okay,
but the key thing to note here is,
nothing in this equation says that
lambda has to be positive.
n can be any integer, not just a
positive one, so we have this
whole string of eigenvalues running from
minus infinity to plus infinity,
going by steps of 2 pi over L.
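Here is a small sketch checking that the boundary condition e^(i lambda L) = 1 holds exactly at these quantized values and fails otherwise (L = 2 and the sample lambda are arbitrary test choices):

```python
import cmath, math

L = 2.0                                      # arbitrary period length
for n in range(-3, 4):
    lam = 2 * math.pi * n / L                # quantized eigenvalue
    # periodic boundary condition is satisfied:
    assert abs(cmath.exp(1j * lam * L) - 1) < 1e-12

# A lambda that is not a multiple of 2*pi/L violates periodicity:
print(abs(cmath.exp(1j * 1.0 * L) - 1))      # clearly nonzero
```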
Those are the eigenvalues, and we have
our eigenfunctions: the eigenfunctions
are e to the i lambda x, that is,
e to the 2 pi i n x over L.
Okay, and of course that's a cosine
plus i times a sine.
Now we've seen these functions
before, we already saw cosine of 2 pi n x
over L, and sine of 2 pi n x over L when
we were diagonalizing the second
derivative. Here they come in a particular
combination, cosine plus i sine, and if
you look at the minus n eigenfunction,
you get cosine minus i sine.
That is to say, the eigenfunction for
eigenvalue minus 2 pi n over L is the same
cosine minus i times the same sine. Cosine plus i
sine has eigenvalue 2 pi n over L,
cosine minus i sine has eigenvalue
minus 2 pi n over L.
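A numerical sketch of that sign flip: applying P = minus i d/dx (by central difference) to e^(i lambda x) and e^(minus i lambda x) recovers eigenvalues plus and minus 2 pi n over L (n = 2, L = 1, and x0 are arbitrary test choices):

```python
import cmath, math

L, n = 1.0, 2
lam = 2 * math.pi * n / L

psi_plus  = lambda x: cmath.exp( 1j * lam * x)   # cos + i sin
psi_minus = lambda x: cmath.exp(-1j * lam * x)   # cos - i sin

def P(f, x, h=1e-6):
    """Apply P = -i d/dx via a central difference."""
    return -1j * (f(x + h) - f(x - h)) / (2 * h)

x0 = 0.3
print((P(psi_plus,  x0) / psi_plus(x0)).real)    # ~ +lam
print((P(psi_minus, x0) / psi_minus(x0)).real)   # ~ -lam
```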
Now, remember, we saw the cosine
and the sine as eigenfunctions of the
second derivative, and that's not
a coincidence, because if you take
minus i times the first derivative,
and do it twice, you get minus the
second derivative. In other words,
P squared is minus A, so anything
that's an eigenfunction of P is
automatically an eigenfunction of A,
with eigenvalue minus lambda squared.
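This can be sketched numerically too: a second central difference approximates the second derivative A f = f'', and on an eigenfunction of P the ratio f''/f comes out to minus lambda squared (the choices n = 3, L = 1, and x0 are arbitrary test values):

```python
import cmath, math

lam = 2 * math.pi * 3 / 1.0            # an eigenvalue of P (n = 3, L = 1)
f = lambda x: cmath.exp(1j * lam * x)

h, x0 = 1e-4, 0.25
f2 = (f(x0 + h) - 2 * f(x0) + f(x0 - h)) / h**2   # second difference ~ f''(x0)

# A f = f'' has eigenvalue -lam^2, i.e. P^2 = -A on this eigenfunction:
print((f2 / f(x0)).real)               # ~ -lam**2
print(-lam**2)
```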
And sure enough, squaring either
2 pi n over L or minus 2 pi n over L
gives you 4 pi squared n squared over
L squared, and that's minus the
corresponding eigenvalue of A. Okay, now these
eigenfunctions of our operator P
give a nice orthogonal basis for L2,
so here are our eigenfunctions,
e to the 2 pi i n x over L, where n
is any integer positive or negative,
and you can check directly that
they're orthogonal. The inner
product of psi n with psi m
is an integral: you take the conjugate of
e to the 2 pi i n x over L,
which is e to the minus 2 pi i n x over L
(conjugating flips the sign of the i),
multiply by e to the 2 pi i m x over L,
and the integrand is
e to the 2 pi i (m minus n) x over L.
If m is different from n, you just
do the integral; the antiderivative
is periodic, so it takes the same
value at the two endpoints,
and you get 0.
If m is equal to n, it's e to the 0
dx, so you're integrating 1 dx,
and that gives you L. In other words,
the inner product of any eigenvector
with itself gives you L, and the inner
product of two different
eigenvectors gives you 0,
which is to say we have an orthogonal
basis for L2. In the next video, we'll
see what happens when you
decompose functions in this basis,
and that's a Fourier series.
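To close the loop, here is a minimal numerical sketch of those inner products (the helper names `psi` and `inner` are mine, not from the lecture); a plain Riemann sum over one period works well here because the integrands are periodic:

```python
import cmath, math

L, N = 2.0, 4096                     # period and sample count (arbitrary)
xs = [k * L / N for k in range(N)]

def psi(n, x):
    return cmath.exp(2j * math.pi * n * x / L)

def inner(n, m):
    """<psi_n, psi_m>: integrate conj(psi_n) * psi_m over one period."""
    return sum(psi(n, x).conjugate() * psi(m, x) for x in xs) * (L / N)

print(abs(inner(1, 3)))              # ~ 0: different eigenvectors are orthogonal
print(abs(inner(2, 2)))              # ~ L: each eigenvector has norm squared L
```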
