Next on the agenda: Hermitian operators.
An operator on an inner product space is called Hermitian if it equals its own adjoint. Likewise, we call a matrix Hermitian if it equals its own conjugate transpose, because if you think of a matrix as an operator on C^n, then its adjoint is its conjugate transpose.
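As a quick numerical sanity check of the definition, here is a minimal sketch using NumPy (the matrix is the 2, i, -i, 2 example that appears later in the lecture):

```python
import numpy as np

# A matrix is Hermitian when it equals its own conjugate transpose.
A = np.array([[2, 1j],
              [-1j, 2]])

def is_hermitian(M):
    """Check M == M^dagger (conjugate transpose)."""
    return np.allclose(M, M.conj().T)

print(is_hermitian(A))                           # True
print(is_hermitian(np.array([[0, 1], [2, 0]])))  # False: not symmetric
```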
So why should we care about Hermitian
operators?
Well, for one, they are extremely 
important in quantum mechanics.
In quantum mechanics, every physically
observable quantity is described by
Hermitian operator. Second, even if you
work purely in the world of math,
Hermitian operators have some wonderful
properties. First of all, if we are working on a finite-dimensional space, then all of its eigenvalues are real. Second, eigenvectors with different eigenvalues are orthogonal. And third, it's automatically diagonalizable. So we don't have to worry about complex eigenvalues, and we don't have to worry about diagonalizability failing: we can always get ourselves a nice orthonormal basis consisting only of eigenvectors. We've seen how useful orthogonal bases can be. We love bases of eigenvectors. We love orthogonal bases. If the operator is Hermitian, we can have both at the same time.
So let's look at some examples.
Our favorite matrix: 2 1 1 2. It's Hermitian: you take its transpose and you get the same thing; you take its conjugate and you get the same thing, because it's a real matrix. Its eigenvalues are 3 and 1. They are real eigenvalues. And if you look at the eigenvectors, you will discover they are (1, 1) and (1, -1), and the inner product of (1, 1) with (1, -1) is 0. So: orthogonal eigenvectors, real eigenvalues. Great! And there is our orthogonal basis for R^2.
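To check this example numerically, a minimal sketch with NumPy's `eigh`, the eigensolver designed for Hermitian matrices:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh assumes its input is Hermitian; it returns real eigenvalues
# in ascending order and orthonormal eigenvector columns.
vals, vecs = np.linalg.eigh(A)
print(vals)           # [1. 3.] -- real, as promised
print(vecs.T @ vecs)  # identity: the eigenvector columns are orthonormal
```

The columns of `vecs` are proportional to (1, -1) and (1, 1), the eigenvectors from the lecture, just normalized to unit length.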
Next let's look at a slightly different matrix. It's got the same 2 1 1 2 block in the corner, and it's got a 3 downstairs: the 3-by-3 matrix with rows (2, 1, 0), (1, 2, 0), and (0, 0, 3). Once again the eigenvalues are 3 and 1. Those are real. So far so good. But now you notice that (0, 0, 1) is an eigenvector. So is (1, 1, 1).
They are both eigenvectors with
the same eigenvalue. And they are
not orthogonal.
That's not a contradiction to our
theorem. Our theorem said
that eigenvectors with different
eigenvalues have to be orthogonal.
And in fact the eigenvector with eigenvalue 1, which is (1, -1, 0), is orthogonal to (0, 0, 1). It's also orthogonal to (1, 1, 1). So our theorem is satisfied.
Now I said we could get a basis of
eigenvectors. How do we do that?
Well we look within the E_3 eigenspace.
And we just take the vectors we had
and we apply Gram-Schmidt to them.
And instead of using 0 0 1 and 1 1 1,
Gram-Schmidt turns them into 0 0 1
and 1 1 0. These are both still
eigenvectors of eigenvalue 3. 
And they are orthogonal to each other.
And they are orthogonal to the eigenvector
with eigenvalue 1. So, lo and behold, these three vectors form an orthogonal set, and we've got an orthogonal basis for R^3.
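That Gram-Schmidt step inside the eigenvalue-3 eigenspace can be sketched like this (assuming the 3-by-3 matrix above):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# Two non-orthogonal eigenvectors, both with eigenvalue 3:
u = np.array([0.0, 0.0, 1.0])
v = np.array([1.0, 1.0, 1.0])

# One Gram-Schmidt step: subtract from v its projection onto u.
w = v - (v @ u) / (u @ u) * u
print(w)      # [1. 1. 0.]
print(A @ w)  # [3. 3. 0.] -- still an eigenvector with eigenvalue 3
print(u @ w)  # 0.0        -- and now orthogonal to u
```

The key point is that Gram-Schmidt only takes linear combinations within the eigenspace, so the output vectors are still eigenvectors with the same eigenvalue.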
Finally, let's look at the matrix 
2 i -i 2. It's a complex matrix.
So you might expect its eigenvalues
to be complex. But they are not.
Because this matrix is Hermitian.
You take the conjugate transpose: the -i goes here and then gets conjugated to become an i; the i goes there and then gets conjugated to become a -i. This is a Hermitian matrix.
In fact, its eigenvalues are 3 and 1.
And its eigenvectors are (i, 1) and (-i, 1). And those are, in fact, orthogonal: the inner product of (i, 1) with (-i, 1) is i-bar times (-i) plus 1-bar times 1, which is (-i)(-i) + 1 = -1 + 1 = 0.
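The same numerical check works for the complex matrix; a minimal sketch:

```python
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])

vals, vecs = np.linalg.eigh(A)
print(vals)  # [1. 3.] -- real, even though the matrix is complex

# np.vdot conjugates its first argument, matching the inner product
# convention used in the lecture (conjugate-linear in the first slot).
print(np.vdot(vecs[:, 0], vecs[:, 1]))  # 0 (up to rounding)
```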
Okay.
So why should a Hermitian operator
have real eigenvalues?
Well, if x is an eigenvector with
eigenvalue λ, we can always
rescale it to have length 1. 
And then we compute.
λ is λ times the inner product of x with itself, because x is a unit vector. We can move the λ inside the second slot of the inner product, because the inner product is linear there. Then we replace λx with Lx, because after all, Lx is λx. And then instead of applying L on the right, we apply L† on the left, because that's the definition of the adjoint: it's the thing that, applied on the left, gives you the same answer as L applied on the right. But L = L†, and Lx is just λx, so now we've got the λ in the left slot. And when you pull things out of the left slot, you have to conjugate them. The inner product of x with itself is 1, so we're left with λ-bar. So λ, by this whole chain, is equal to λ-bar, and that means λ has to be real. So every eigenvalue of a Hermitian operator is automatically real.
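Written out as a single chain of equalities (using the lecture's convention that the inner product is conjugate-linear in its first slot and linear in its second, with x a unit eigenvector and Lx = λx):

```latex
\lambda
= \lambda \langle x, x \rangle
= \langle x, \lambda x \rangle
= \langle x, Lx \rangle
= \langle L^{\dagger} x, x \rangle
= \langle Lx, x \rangle
= \langle \lambda x, x \rangle
= \bar{\lambda} \langle x, x \rangle
= \bar{\lambda}.
```

Since λ equals its own conjugate, λ is real.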
As for the eigenvectors, let's suppose that x and y are eigenvectors with different eigenvalues: x with eigenvalue λ_1, y with eigenvalue λ_2. If you take λ_1 times the inner product of x with y, that's the same thing as the inner product of λ_1 x with y. (Really it should be λ_1-bar, but λ_1 is real, so it doesn't make a difference.) That's the inner product of Lx with y. But that's the same thing as the inner product of L†x with y, because L and L† are the same. And L† applied to x on the left is the same thing as L applied to y on the right; that's the definition of the adjoint. And L applied to y gives you a factor of λ_2. So you get λ_2 times the inner product of x with y. So λ_1 times the inner product of x with y is the same thing as λ_2 times the inner product.
So (λ_1 - λ_2) times the inner product must be 0. But we said that λ_1 and λ_2 were different, so λ_1 - λ_2 is not 0. We can divide both sides of the equation by λ_1 - λ_2, and we get that the inner product is 0. So we've shown that the eigenvalues are real, and we've shown that eigenvectors with different eigenvalues are orthogonal. The big deal is that a Hermitian operator is diagonalizable, and that's what we are going to show in the next video.
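The eigenvector argument can also be written as one chain (x with eigenvalue λ_1, y with eigenvalue λ_2, and λ_1 real so it equals its own conjugate):

```latex
\lambda_1 \langle x, y \rangle
= \bar{\lambda}_1 \langle x, y \rangle
= \langle \lambda_1 x, y \rangle
= \langle L x, y \rangle
= \langle L^{\dagger} x, y \rangle
= \langle x, L y \rangle
= \langle x, \lambda_2 y \rangle
= \lambda_2 \langle x, y \rangle,
```

so (λ_1 - λ_2)⟨x, y⟩ = 0, and since λ_1 ≠ λ_2, the inner product ⟨x, y⟩ must be 0.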
