In this video we're going
to talk about the linear algebra
of rotations in three dimensions,
in particular the group SO(3)
of rotations.
And the Lie algebra,
little so(3).
So first we need some
definitions.
Big SO(3), capital S,
capital O, 3,
is the group of all rotations
in three dimensions.
The S and the O stand
for special orthogonal.
This is the three dimensional
special orthogonal group.
Orthogonal means that they're
orthogonal matrices and
special means that the
determinant is +1
as opposed to -1.
Little so3 is the set of all three
by three antisymmetric real matrices
and this is called the Lie algebra
of big SO3.
This is usually written
with gothic letters,
I can't draw gothic letters.
Whenever we're using the Lie
algebra I'm going to write it in red.
So when you see little s-o in red,
that's the Lie algebra.
Big S-O is the group.
So what the heck is a Lie algebra
and what the heck is a group?
Well, first of all let's go
look at the group.
A group is a set of elements
such that whenever you multiply two of them,
the product is still in the group.
Whenever you take the inverse
of one of them it's still
in the group.
And I claim that SO3
has this property.
If R and T are rotation matrices,
then I claim that R inverse is a
rotation matrix and that R * T
is a rotation matrix.
Let's see how that works.
You see, to be in SO(3),
a matrix has to be orthogonal,
which is to say that its transpose
times the matrix has to be
the identity.
And it needs to have a
determinant of +1.
So we need to show that
R inverse has this property.
And we need to show
that RT has this property.
So let's do R^-1 first.
Well since R was an orthogonal matrix,
R^-1 is just R transpose.
So if you take R^-1 transpose
times R^-1, that's R transpose transpose
in other words, R.
Times R transpose
and R * R transpose is
already the identity because
R and R transpose are inverses
and the determinant of R inverse
is the determinant of R transpose
which is the determinant of R
which is one.
So far so good.
The inverse of an element
of SO(3) is in SO(3).
What about the product?
Well if you take RT transpose
times RT and I apologize for using
the same letter T for transpose
as for matrix.
Then it's T transpose,
R transpose, RT.
Group that as T transpose,
(R transpose R), T.
But R transpose R
is the identity.
So you get T transpose T,
and that's also the identity
because R is in the group
and T is in the group.
And the determinant of RT
is the determinant of R
times the determinant of T,
just 1 * 1 which is 1.
So we've just shown that
SO(3) really is a group.
Sure enough, I called it
a Lie group; Lie means
that it's a continuous group,
and it really is.
The product of any two
rotation matrices is
a rotation matrix.
The inverse of any rotation matrix
is a rotation matrix.
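As a quick numerical sanity check on this closure argument, here's a NumPy sketch (not from the video; the particular angles are arbitrary):

```python
import numpy as np

def is_rotation(M, tol=1e-12):
    """True if M is orthogonal (M^T M = I) with determinant +1."""
    return (np.allclose(M.T @ M, np.eye(3), atol=tol)
            and np.isclose(np.linalg.det(M), 1.0))

def rot_z(theta):
    """Rotation by theta about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(theta):
    """Rotation by theta about the x axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

R = rot_z(0.7)
T = rot_x(1.3)

closed_under_product = is_rotation(R @ T)             # R*T is again a rotation
closed_under_inverse = is_rotation(np.linalg.inv(R))  # and so is R^-1 (= R^T)
```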
Okay.
The commutator of two matrices
is defined to be the first times the second
minus the second times the first.
And this is often written in 
this bracket notation,
square brackets.
Square bracket of A and B
means AB - BA.
And a Lie algebra is a set
of matrices that form a vector space.
You could add the matrices,
you can multiply them by scalars
and such that the commutator of
any two matrices in your set
is also in the set.
And I claim that little so3
is in fact a Lie algebra.
So let's check.
They're obviously closed
under scalar multiplication
and addition.
But if A and B are in little so3,
remember we define little so3
to be the set of all
antisymmetric real matrices.
So if A and B are
antisymmetric,
A transpose is -A
and B transpose is -B.
And let's take the transpose
of the commutator.
That's the transpose of AB - BA
and when you take the transpose
of something, you take the
product of the transposes
in the opposite order, so the
transpose of AB is B transpose,
A transpose, and the transpose
of BA is A transpose B transpose.
But B transpose is -B
and A transpose is -A,
so this is -B * -A,
in other words, BA.
And this is -A * -B,
in other words, AB.
So the transpose of the commutator
is minus the commutator.
So the commutator is antisymmetric,
which is what we want.
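You can check this antisymmetry of the commutator numerically; here's a small NumPy sketch (the random matrices are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_antisymmetric(n=3):
    """M - M^T is antisymmetric for any real square matrix M."""
    M = rng.standard_normal((n, n))
    return M - M.T

A = random_antisymmetric()
B = random_antisymmetric()

comm = A @ B - B @ A                  # the bracket [A, B]
antisym = np.allclose(comm.T, -comm)  # its transpose is minus itself
```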
Okay. So we've got our Lie group
and our Lie algebra.
And how the heck are they related?
Well I claim that if you give me
an antisymmetric matrix
that is, an element of the algebra,
its exponential is in the group.
So here's why.
If A is in the algebra,
then its † is just its transpose,
because A is a real matrix,
and that's -A.
So take (-iA)†: the conjugate of -i,
which is i, times A†,
and A† is -A,
so that's i times -A, which is -iA.
So (-iA)† is -iA, which means -iA
is Hermitian.
So if you take e^A, that's e^i
times something Hermitian,
we saw in the last video that 
e^i times a Hermitian matrix
is unitary, so e^A is unitary. But
e^A is also real because A was real.
If you exponentiate a real matrix,
by the power series if you want,
you get a real matrix.
So it's real and unitary and
that makes it orthogonal.
What's more, A is anti-symmetric
so it's zero on the diagonal.
So its trace is zero.
The determinant of the exponential
is the exponential of the trace
which is e^0 which is 1.
So anytime you exponentiate
an anti-symmetric matrix,
you get a rotation matrix.
You know, it's orthogonal,
it has a determinant of 1.
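Here's a sketch of that claim in NumPy, building the exponential from its power series (the particular antisymmetric matrix is made up for illustration):

```python
import numpy as np

def expm_series(A, terms=40):
    """Matrix exponential via the power series sum_k A^k / k!."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

# An arbitrary real antisymmetric matrix (zero diagonal, A^T = -A).
A = np.array([[ 0.0, -0.3,  0.2],
              [ 0.3,  0.0, -0.5],
              [-0.2,  0.5,  0.0]])

R = expm_series(A)
orthogonal = np.allclose(R.T @ R, np.eye(3), atol=1e-10)  # e^A is orthogonal
det_is_one = np.isclose(np.linalg.det(R), 1.0)            # and special
```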
Now the eigenvalues of A,
since -iA was Hermitian,
-iA has real eigenvalues which
means that A has to have pure
imaginary eigenvalues.
They come in conjugate pairs
so if iΘ is an eigenvalue,
-iΘ is an eigenvalue,
and the remaining eigenvalue
has to be zero.
The eigenvalues of e^A are e^0,
e^iΘ and e^-iΘ, and we saw before
that you can figure out the axis
and the angle of rotation from
the eigenvalues and eigenvectors
of a rotation matrix.
The axis is the eigenvector
with eigenvalue 1 and
the angle is Θ.
So e^A is a rotation by an angle Θ
about an axis which is this eigenvector.
The eigenvector with eigenvalue 1
for e^A or 0 for A.
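A sketch of reading the angle and axis off the eigen-data, for a made-up antisymmetric matrix:

```python
import numpy as np

A = np.array([[ 0.0, -0.4,  0.1],
              [ 0.4,  0.0, -0.7],
              [-0.1,  0.7,  0.0]])   # arbitrary antisymmetric example

vals, vecs = np.linalg.eig(A)

# The eigenvalues of A are 0 and a conjugate pair +/- i*theta.
theta = np.max(np.abs(vals.imag))

# The rotation axis of e^A is the eigenvector of A with eigenvalue 0.
axis = np.real(vecs[:, np.argmin(np.abs(vals))])
axis = axis / np.linalg.norm(axis)

# Cross-check theta against trace(A^2) = -2*theta^2.
theta_from_trace = np.sqrt(-np.trace(A @ A) / 2)
```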
So for example, let's look at this matrix.
It's antisymmetric, and its eigenvalues
wind up being 0, iπ/3, and -iπ/3.
The way you can figure out what
the eigenvalues are is you look
at the trace of A^2. 
That has to be the sum of the
squares of the eigenvalues, so that's
0^2 + (iΘ)^2 + (-iΘ)^2,
which is -2Θ^2.
So from the trace of A^2 you can
figure out what Θ is.
In this case, Θ is π/3, and the
eigenvector with eigenvalue 0
is just (1,1,1): you can see that
each row sums to zero, so
if you multiply this matrix by (1,1,1),
you get the zero vector.
That means e^A
must be a rotation by π/3
about this axis.
And if you work it out,
it comes out to this matrix
which we've seen before.
We've seen that this is
a special orthogonal matrix
which is a rotation about the
(1,1,1) axis by π/3.
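The matrix on the board isn't reproduced in the transcript, but here's a NumPy sketch using one matrix with the stated properties (eigenvalues 0 and ±iπ/3, each row summing to zero), with the exponential computed by Rodrigues' formula:

```python
import numpy as np

# One antisymmetric matrix with eigenvalues 0, +/- i*pi/3 and rows summing
# to zero; the matrix shown on screen may differ by a sign convention.
A = (np.pi / (3 * np.sqrt(3))) * np.array([[ 0.0, -1.0,  1.0],
                                           [ 1.0,  0.0, -1.0],
                                           [-1.0,  1.0,  0.0]])

theta = np.sqrt(-np.trace(A @ A) / 2)              # recovers pi/3
rows_sum_to_zero = np.allclose(A @ np.ones(3), 0)  # (1,1,1) has eigenvalue 0

# Rodrigues' formula: e^A = I + sin(theta) K + (1 - cos(theta)) K^2, K = A/theta.
K = A / theta
R = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
```

For this choice of A, R comes out with entries 2/3 and -1/3, and it fixes the (1,1,1) direction, as a rotation about that axis must.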
