In the next few videos, we're going to
talk about classes of operators that have
some very nice properties.
These operators are usually defined by
how they relate to their adjoints.
Before we can talk about those operators,
we have to talk about what adjoints are.
Let's suppose that V is a vector space
and it's got an inner product.
Let's suppose we have an operator L
on that inner product space.
The adjoint of that operator is
another operator, which we denote
L^dagger, and that's supposed to be
a little dagger. The idea is that
applying L^dagger to the left-hand
factor of an inner product gives you
the same result as applying L to the
right-hand factor:
⟨L^dagger(x), y⟩ = ⟨x, L(y)⟩.
This has to hold for all vectors x and y
in our space. So for example,
let's suppose that we're in R2 with a 
standard inner product.
Let's suppose I give you the operator L that
takes the vector (x_1, x_2) to (x_2, 0):
it promotes x_2 to the first position
and puts zero in the second slot.
Then I claim that L^dagger does
sort of the opposite.
It pushes x_1 down to the second position
and puts zero in the first slot:
L^dagger(x_1, x_2) = (0, x_1).
To check that, we compare.
First we see what happens if you take
x inner product with Ly.
Well, that's (x_1, x_2) · (y_2, 0),
and that's (x_1)(y_2).
On the other hand, L^dagger(x) inner
product with y:
L^dagger(x) = (0, x_1), so you get (0, x_1) · (y_1, y_2).
Well, that also gives you (x_1)(y_2).
So these two things are equal, and
since this works no matter what x is
and no matter what y is, this is
the adjoint of this operator.
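The check above can be spot-checked numerically. Here's a minimal sketch in NumPy (the function names `L` and `L_dagger` are just illustrative):

```python
import numpy as np

# L takes (x1, x2) to (x2, 0); the claimed adjoint takes (x1, x2) to (0, x1).
def L(v):
    return np.array([v[1], 0.0])

def L_dagger(v):
    return np.array([0.0, v[0]])

rng = np.random.default_rng(0)
x, y = rng.standard_normal(2), rng.standard_normal(2)

# <x, Ly> should equal <L_dagger(x), y>; both sides work out to x1*y2.
lhs = np.dot(x, L(y))
rhs = np.dot(L_dagger(x), y)
assert np.isclose(lhs, rhs)
```

Of course, a random test only samples a couple of vectors; the algebra above is what shows it holds for all x and y.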
Okay, so that's the definition.
So what?
Let's figure out some properties
of adjoints.
The first is that they always exist.
If you've got infinite-dimensional vector
spaces, there are some subtle issues
about domains; we're not going
to worry about them.
In finite dimensions, adjoints always exist.
In infinite dimensions, they exist if you
make your definitions correctly.
The second is that the adjoint of
the sum of two operators is
just the adjoint of the first
plus the adjoint of the second:
(L + M)^dagger = L^dagger + M^dagger.
Third, the adjoint of a constant times an
operator is not the constant times
the adjoint. It's the complex conjugate
of the constant times the adjoint:
(cL)^dagger = c̄ L^dagger.
Fourth, the adjoint of a product of operators
is the product of the adjoints in the
opposite order: (LM)^dagger = (M^dagger)(L^dagger).
And fifth, the adjoint of the adjoint
is the operator itself: (L^dagger)^dagger = L.
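Using the fact (shown later in the video) that in Cn the adjoint of a matrix is its conjugate transpose, we can spot-check these properties numerically. A sketch, with arbitrary random complex matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
L = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
c = 2.0 - 3.0j

# In C^n with the standard inner product, adjoint = conjugate transpose.
adj = lambda A: A.conj().T

assert np.allclose(adj(L + M), adj(L) + adj(M))      # (L + M)† = L† + M†
assert np.allclose(adj(c * L), np.conj(c) * adj(L))  # (cL)† = c̄ L†
assert np.allclose(adj(L @ M), adj(M) @ adj(L))      # (LM)† = M† L†
assert np.allclose(adj(adj(L)), L)                   # (L†)† = L
```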
You can prove these by sort of chasing,
well, except for one:
the first, existence, we're going to prove later.
We can prove two through five
by definition chasing.
I'm just going to prove four, and the
others are proved in similar ways.
For four, we want to figure out: what,
when acting on x, does the same as
LM acting on y?
Well, LM acting on y is the same thing
as L acting on My, so
⟨x, LMy⟩ = ⟨x, L(My)⟩.
L acting on the right is the same thing
as L^dagger acting on the left, so
that equals ⟨L^dagger(x), My⟩.
Then we can move the M
to the other side the same way:
M acting on the right is the same thing
as M^dagger acting on the left.
So we get ⟨M^dagger L^dagger(x), y⟩,
and that's just (M^dagger)(L^dagger)
acting on x.
So (M^dagger)(L^dagger) acting on x gives
you the same result as LM acting on y.
So the adjoint of LM is
(M^dagger)(L^dagger).
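The chain of equalities in this proof can be traced numerically, step by step. A sketch using `np.vdot`, which conjugates its first argument and so matches the standard complex inner product:

```python
import numpy as np

rng = np.random.default_rng(2)
L = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

adj = lambda A: A.conj().T  # adjoint = conjugate transpose in C^n

step1 = np.vdot(x, L @ (M @ y))            # <x, L(My)>
step2 = np.vdot(adj(L) @ x, M @ y)         # <L† x, My>
step3 = np.vdot(adj(M) @ adj(L) @ x, y)    # <M† L† x, y>
assert np.isclose(step1, step2) and np.isclose(step2, step3)
```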
Okay, but how do you get them?
Well, adjoints are actually very easy
if we're working with Rn with a standard
inner product.
If you're working with Rn, then of
course operators are nothing but
n by n matrices.
And I claim that the adjoint of an
operator is just its transpose.
You're familiar with the transpose.
In fact, you can think of adjoints as, in
some sense, a generalization
of transposes.
Why is that?
Remember the inner product of two vectors
in Rn: you take the transpose of the first
and you multiply it by the second.
So the inner product of x with Ay is
(x^transpose)(Ay).
But you can think of that as
(x^transpose A)y,
and x^transpose A is (A^transpose x)^transpose.
So this is the inner product of
(A^transpose)x with y.
One way to look at that: here is y,
and if you put these together,
this is Ay. Then you take the inner
product of x with Ay.
That's all well and good, but you could
just as well do the other multiplication first.
You can take the row that you get when
you multiply the row x^transpose by the matrix,
and then multiply that by y.
That row is the transpose of (A^transpose)x,
and its inner product with y is the same thing as
x inner product with Ay.
Whenever you have a row times a matrix
times a column, you can group it as
(row times matrix) times column,
or you can group it as
row times (matrix times column);
by associativity, the answer is the same.
If we go back to our original example,
where we said L(x) gives you (x_2, 0),
that means the matrix of L is just
(0 1; 0 0). Its adjoint is (0 0; 1 0).
It's just its transpose, and L^dagger of
(x_1, x_2) is (0, x_1).
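In matrix form, the running example looks like this; the regrouping (row times matrix) times column versus row times (matrix times column) is exactly what the assertion checks:

```python
import numpy as np

# The matrix of L from the running example, and its transpose.
L = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # L(x1, x2) = (x2, 0)
Lt = L.T                     # claimed adjoint: L_dagger(x1, x2) = (0, x1)

rng = np.random.default_rng(3)
x, y = rng.standard_normal(2), rng.standard_normal(2)

# <x, Ly> = x^T (L y) = (x^T L) y = (L^T x)^T y = <L^T x, y>
assert np.isclose(x @ (L @ y), (Lt @ x) @ y)
```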
Okay, that's Rn. What about Cn?
We have Cn with the standard inner product.
Well, the standard inner product for Cn
is that the inner product of x with y is
x^transpose-conjugate times y.
So there's got to be a conjugation in here.
In fact, the adjoint is the
transpose conjugate.
The transpose conjugate of a matrix
is sometimes called the Hermitian conjugate.
So I claim that A^dagger is
A^transpose-conjugate.
In other words, the ij entry of A^dagger
is the ji entry of A, conjugated.
Why? Well, the inner product of x with Ay
is x^transpose-conjugate times Ay.
Then you put the parentheses this way
and you say,
that's (x^transpose-conjugate A) times y,
and x^transpose-conjugate A is
((A^transpose-conjugate)x)^transpose-conjugate.
So that's the inner product of
(A^transpose-conjugate)x with y.
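Here's a quick numerical sanity check of the claim, sketched with an arbitrary random complex matrix; `np.vdot` conjugates its first argument, matching the standard inner product on Cn:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
x = rng.standard_normal(2) + 1j * rng.standard_normal(2)
y = rng.standard_normal(2) + 1j * rng.standard_normal(2)

# <x, Ay> equals <A_dagger x, y> when A_dagger is the conjugate transpose.
# (The plain transpose would not work here: the conjugation is essential.)
lhs = np.vdot(x, A @ y)
rhs = np.vdot(A.conj().T @ x, y)
assert np.isclose(lhs, rhs)
```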
For example, suppose you had an operator
given by the matrix with rows
(1+i, 2+3i) and (3-4i, 6i), and you want to take its
adjoint. First you take the transpose:
the 3-4i goes up here and the 2+3i
goes down here. But then you have
to conjugate everything.
Instead of 3-4i, you get 3+4i.
Instead of 2+3i, you get 2-3i.
Instead of 1+i, you get 1-i.
Instead of 6i, you get -6i.
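That example, worked in NumPy:

```python
import numpy as np

# The 2x2 example from the lecture: transpose, then conjugate every entry.
A = np.array([[1 + 1j, 2 + 3j],
              [3 - 4j, 0 + 6j]])
A_dagger = A.conj().T

expected = np.array([[1 - 1j, 3 + 4j],
                     [2 - 3j, 0 - 6j]])
assert np.allclose(A_dagger, expected)
```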
Now we'll talk about adjoints in more
complicated spaces in the next video.
