In today's lecture we are going to talk about rotation matrices, eigenvalues, and eigenvectors. So far we have already seen certain special kinds of matrices: we saw orthogonal matrices, and we said that orthogonal matrices leave the length of a vector unchanged. Rotation matrices are again defined in terms of transformations of vectors.
So we can ask the question: what is the matrix that transforms a vector in the following way? I will start with two-dimensional space just to illustrate the point; you can then go to three-dimensional space or other spaces as required. Suppose you have a vector in two-dimensional space, represented by some arrow; let me call this vector (x, y). Say I want to take this vector and rotate it by an angle θ; then I get some vector (x', y').
I can write this in the following form: the vector (x', y') is obtained by taking the vector (x, y) and rotating it. So you want a transformation that takes the vector (x, y) to the vector (x', y'), and, as we said before, a general linear transformation is represented by a matrix. This 2 x 2 matrix is what is called the rotation matrix, and I will call it R(θ): the rotation by angle θ.
You can easily work out, just by basic trigonometry, how x' is obtained from x and y. I won't go through the details, but the matrix has the form

R(θ) = [ cos θ   -sin θ ]
       [ sin θ    cos θ ]

Just to emphasize, we get this from the expressions x' = x cos θ - y sin θ and y' = x sin θ + y cos θ, which you can easily work out; it is not very difficult. So the rotation matrix is given by this matrix.
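This is easy to try numerically. Here is a small sketch in Python with NumPy (not part of the lecture, just an illustration) that builds R(θ) and rotates a vector:

```python
import numpy as np

def rotation_2d(theta):
    """2x2 rotation matrix: rotates a column vector counterclockwise by theta."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Rotate the unit vector along x by 90 degrees: (1, 0) -> (0, 1).
v = np.array([1.0, 0.0])
v_rot = rotation_2d(np.pi / 2) @ v
print(np.round(v_rot, 6))  # [0. 1.]
```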
Now, since you are only rotating the vector, you are not changing its magnitude, and therefore you expect that R(θ) is an orthogonal matrix: R(θ)^T should be equal to R(θ)^-1, or in other words R(θ)^T R(θ) is the identity, which in this case is

[ 1  0 ]
[ 0  1 ]

What is R(θ)^T? It is

R(θ)^T = [  cos θ   sin θ ]
         [ -sin θ   cos θ ]

and you can clearly see that this transpose corresponds to rotation by -θ. If you rotate by -θ, then cos(-θ) is the same as cos θ, while sin(-θ) = -sin θ, so the entry -sin(-θ) becomes +sin θ and the entry sin(-θ) becomes -sin θ: rotating by -θ gives exactly this matrix. And clearly, if you take a vector, rotate it by θ, and then rotate it by -θ, the product of the two matrices is the identity, so you get back the original vector.
In other words, if I take R(θ)^T R(θ) and operate on the vector (x, y), I get back (x, y). So clearly R is an orthogonal matrix; this is called the matrix of rotations.
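The orthogonality claims can be confirmed numerically; a sketch (again assuming NumPy):

```python
import numpy as np

theta = 0.7  # any angle works
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# R^T R should be the identity, and R^T should equal rotation by -theta.
assert np.allclose(R.T @ R, np.eye(2))
R_minus = np.array([[np.cos(-theta), -np.sin(-theta)],
                    [np.sin(-theta),  np.cos(-theta)]])
assert np.allclose(R.T, R_minus)

# Lengths are preserved: |R v| = |v| for any v.
v = np.array([3.0, -4.0])
assert np.isclose(np.linalg.norm(R @ v), np.linalg.norm(v))
```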
Now, in this rotation you can ask: what is the axis about which we rotated? In this case we imagine that the axis is perpendicular to the plane of the paper, and you are rotating by an angle θ about this axis. What this means is that if you had a vector in three dimensions, say (x, y, z), you can generalize: imagine that you rotate the vector about the z axis by angle θ. What would this matrix R_z(θ) be? You can easily see how to get it; I will just write the expression for R_z(θ) and motivate the answer, and you can easily verify it.
If you rotate any vector about the z axis by angle θ, the first thing to note is that the z coordinate of the vector is unchanged. So, denoting the rotated vector by (x', y', z'), clearly z' has to be equal to z, which means the third row of the matrix must be (0 0 1). Moreover, x' = x cos θ - y sin θ and y' = x sin θ + y cos θ are independent of z, so the upper-left block is just the same 2 x 2 rotation matrix that we had before. So R_z(θ) is the 3 x 3 rotation matrix

R_z(θ) = [ cos θ   -sin θ   0 ]
         [ sin θ    cos θ   0 ]
         [ 0        0       1 ]

with (0 0 1) along the third row and third column.
This corresponds to rotation about z by θ, and you can easily verify that it is orthogonal: R_z(θ) R_z(θ)^T is nothing but the identity. In other words, R_z(θ)^T = R_z(θ)^-1 = R_z(-θ): rotation by -θ about the z axis is the inverse of rotation by θ about the z axis, and it is exactly equal to the transpose of this matrix of rotations.
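These three properties of R_z, that the z component is untouched, that the transpose equals rotation by -θ, and that the matrix is orthogonal, can be checked in a few lines (a NumPy sketch, not part of the lecture):

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z axis by theta: the z component is left unchanged."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c,  -s,  0.0],
                     [s,   c,  0.0],
                     [0.0, 0.0, 1.0]])

v = np.array([1.0, 2.0, 5.0])
w = rot_z(0.3) @ v
assert np.isclose(w[2], v[2])                             # z unchanged
assert np.allclose(rot_z(0.3).T, rot_z(-0.3))             # transpose = rotation by -theta
assert np.allclose(rot_z(0.3).T @ rot_z(0.3), np.eye(3))  # orthogonal
```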
These rotation matrices are extremely useful in a lot of ways. Often we want to understand the symmetries of molecules, and then we use these rotation operations; now we have a general way to rotate any vector. In this case we chose the z axis, but what if you want to rotate about the x axis by an angle φ? You can easily work out what R_x(φ) will be. In this case it is the x coordinate that is unchanged, so instead of the 1 in the bottom-right position that we had for R_z, you have the 1 in the top-left position, with zeros next to it, and the rest looks very similar.
So now the cos φ, -sin φ, sin φ, cos φ block sits in the lower right:

R_x(φ) = [ 1   0        0      ]
         [ 0   cos φ   -sin φ  ]
         [ 0   sin φ    cos φ  ]

Again you can verify that this is orthogonal and that its transpose is its inverse. You can also have rotation about the y axis, and, going further, you can rotate about an arbitrary axis; this is more complicated.
In other words, suppose you have your coordinate system x, y, z, and some vector (I will show it in red). Now imagine that you want to rotate this vector by some angle about some axis that could be something completely different; we are really thinking in three dimensions here, so the axis can point in an arbitrary direction, and rotating by some angle θ about it takes the vector to some new position.
Then you could ask: what is the matrix for this rotation, rotating about this axis by angle θ? This is a considerably more difficult question; it is more complicated but can be worked out. I won't detail the steps, but you can look it up in various books and see how to work it out. I just wanted to mention that these rotation matrices are quite useful. You could also consider things like products of rotations.
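For the arbitrary-axis rotation mentioned above, one standard closed form (not derived in this lecture, but found in the books referred to) is Rodrigues' rotation formula: R = I + sin(θ) K + (1 - cos(θ)) K², where K is the cross-product matrix of the unit axis n. A sketch:

```python
import numpy as np

def rot_axis(n, theta):
    """Rotation by theta about the axis n, via Rodrigues' formula."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)            # make sure the axis is a unit vector
    K = np.array([[0.0, -n[2], n[1]],
                  [n[2], 0.0, -n[0]],
                  [-n[1], n[0], 0.0]])   # cross-product matrix: K @ v == n x v
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

# Sanity check: with the z axis this must reduce to R_z(theta).
theta = 0.4
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
assert np.allclose(rot_axis([0, 0, 1], theta), Rz)
```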
For example, you could first rotate about the x axis by θ and then about the z axis by φ. The corresponding matrix is built as follows: the first operation is R_x(θ), and since your vector comes in on the right of the product, the next operation, R_z(φ), goes to its left. So you write it as R_z(φ) R_x(θ).
All of these are things you can do with rotation matrices, and when we do the exercises we will see some examples of using them.
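The "first R_x(θ), then R_z(φ)" composition can be checked directly; note also that the order matters, since matrix multiplication does not commute in general (a NumPy sketch):

```python
import numpy as np

def rot_x(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

theta, phi = 0.3, 0.5
v = np.array([1.0, 2.0, 3.0])

# First rotate about x by theta, then about z by phi: R_z(phi) R_x(theta) v.
combined = rot_z(phi) @ rot_x(theta)
assert np.allclose(combined @ v, rot_z(phi) @ (rot_x(theta) @ v))

# The product of two orthogonal matrices is again orthogonal.
assert np.allclose(combined.T @ combined, np.eye(3))

# Order matters: rotating about z first gives a different matrix in general.
assert not np.allclose(combined, rot_x(theta) @ rot_z(phi))
```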
So what does a rotation matrix do? It takes a vector and rotates it by some angle, keeping the length fixed. Now, the next concept I want to talk about is that of eigenvalues and eigenvectors of a matrix, and let me emphasize that last part: it is very important to understand that the idea of eigenvalues and eigenvectors is formulated on the basis that you are given a matrix and you want to find its eigenvalues and eigenvectors.
So let me write it down: given a matrix A, we can find some vector x (I will put a vector arrow over it, just to make sure you don't confuse it with a scalar) and some scalar λ such that A x = λ x. If we can do this, then x is called an eigenvector of A with eigenvalue λ. This eigenvalue corresponds to this eigenvector and vice versa; a different eigenvector will in general come with a different eigenvalue. So you could have, for example, two eigenvalue-eigenvector pairs.
For example, you could have A x_1 = λ_1 x_1 and A x_2 = λ_2 x_2; notice that for the same matrix A, I have a pair of eigenvalues. This is nothing but an example: you could have two, three, four, any number of such pairs.
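Here is a small numeric illustration of two such pairs (the matrix is an assumed example, and numpy.linalg.eig is a library eigensolver the lecture has not introduced):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# This symmetric matrix has eigenvalues 3 and 1, with eigenvectors along
# (1, 1) and (1, -1): A(1,1) = (3,3) = 3*(1,1), and A(1,-1) = (1,-1).
x1 = np.array([1.0, 1.0])
x2 = np.array([1.0, -1.0])
assert np.allclose(A @ x1, 3 * x1)
assert np.allclose(A @ x2, 1 * x2)

# A library eigensolver recovers the same eigenvalues.
lams, vecs = np.linalg.eig(A)
assert np.allclose(np.sort(lams), [1.0, 3.0])
```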
Now what is happening here? Let's think in terms of transformations. A x is a linear transformation of a vector that gives you another vector: you are given a matrix A, it takes a vector and gives you another vector.
Given this matrix A, we are asking: what is a possible vector which, when operated on by A, gives another vector in the same direction? The point is that when A acts on x, it yields a vector parallel to x; it does not change the direction of x.
That is what is meant by an eigenvector and an eigenvalue: eigenvectors represent those directions which are preserved. Now, some interesting things about eigenvectors. Suppose x is an eigenvector of A with eigenvalue λ; remember, each eigenvector is connected with an eigenvalue and each eigenvalue with an eigenvector.
Now suppose I take c times x, where c is a scalar, and call this vector y = c x. You can clearly see that A y = A (c x) = c (A x), since c is just a scalar and can be taken to the left. Now A x = λ x, so this is c λ x, and switching c and λ, I can write it as λ (c x) = λ y.
In other words, looking at this equation, we get A y = λ y, which says that y is an eigenvector of A with eigenvalue λ. So if you take an eigenvector and multiply it by a constant, you get another eigenvector with the same eigenvalue. That is why eigenvectors really refer to directions and not magnitudes: you can always multiply an eigenvector by a constant and get another eigenvector. So when we talk about distinct eigenvectors, we mean eigenvectors pointing along different directions.
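The argument above, that c x is again an eigenvector with the same eigenvalue, can be checked numerically; here with an assumed example matrix A and its eigenvector (1, 1):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, 1.0])   # eigenvector of A with eigenvalue 3

for c in [2.0, -5.0, 0.1]:
    y = c * x
    # A y = A(c x) = c(A x) = c(lambda x) = lambda y, so y is still an eigenvector.
    assert np.allclose(A @ y, 3 * y)
```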
Now, how do we determine the eigenvalues and eigenvectors of a matrix? Suppose you want to determine the eigenvalues and eigenvectors of A. We have the equation A x = λ x, and you solve for x and λ; with this one equation you want to find both. Say we are in three-dimensional space, so x has three components and λ is a scalar: it looks like there are four unknowns, while A x = λ x is a vector equation and, since you are in three-dimensional space, represents only three equations. So you have three equations and four things to determine.
It appears that way, but as we saw, eigenvectors only refer to directions and not magnitudes, so we don't really need to worry about the magnitude; we can in fact determine the distinct directions. Of the three components, only two can be determined independently, and one can be chosen freely.
Let us try to work out how to go about this. Start from A x = λ x. We can write this as A x = λ I x, where I is the identity matrix,

I = [ 1  0  0 ]        λI = [ λ  0  0 ]
    [ 0  1  0 ]             [ 0  λ  0 ]
    [ 0  0  1 ]             [ 0  0  λ ]

Then, taking λ I x to the left, I can write (A - λI) x = 0, the zero vector. This is a system of homogeneous linear equations.
For a system of homogeneous linear equations, we already mentioned that a non-trivial solution, that is, x ≠ 0, exists only if det(A - λI) = 0. We saw this in the problem set from the previous module: when we looked at the linear independence of three vectors in three-dimensional space, we got a system of homogeneous linear equations, and we said that a non-trivial (x ≠ 0) solution exists only if this determinant is zero.
So now we have an additional condition, and this will give us the eigenvalues: det(A - λI) = 0. What does this look like? In the usual notation, A has components a_11, a_12, a_13, a_21, a_22, a_23, a_31, a_32, a_33, and subtracting the matrix λI removes λ from each diagonal entry; we then require the determinant of the result to be zero.
Explicitly, the condition is

det [ a_11 - λ    a_12        a_13     ]
    [ a_21        a_22 - λ    a_23     ]  =  0
    [ a_31        a_32        a_33 - λ ]

and you can clearly see that when you expand the determinant, the product of the diagonal terms contributes a term involving λ³, so the left-hand side is a cubic polynomial in λ. That implies there are three roots.
We have a cubic polynomial set equal to zero, so there are three roots: you can determine three eigenvalues, which I will just call λ_1, λ_2, λ_3. Then, for each eigenvalue, we can determine the corresponding eigenvector.
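For a concrete (assumed) 3 x 3 example, the cubic det(A - λI) = 0 can be expanded by hand and its three roots compared with a library eigensolver:

```python
import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 4.0],
              [0.0, 4.0, 9.0]])

# By hand: det(A - lI) = (2 - l) * ((3 - l)(9 - l) - 16)
#                      = -l^3 + 14 l^2 - 35 l + 22.
roots = np.roots([-1, 14, -35, 22])   # three roots of the cubic
assert np.allclose(np.sort(roots), np.sort(np.linalg.eigvals(A)))
```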
For example, for λ_1 write the corresponding eigenvector with components (x_1, y_1, z_1). Since A (x_1, y_1, z_1) = λ_1 (x_1, y_1, z_1), what you have is the equation (A - λ_1 I)(x_1, y_1, z_1) = 0, a system of equations that you can solve for x_1, y_1, z_1. And since it is a homogeneous system, you can only determine two of the components independently: fix the third one, and the other two are determined. We will see examples of this as we go. But the point is that we now know how to calculate the eigenvalues and eigenvectors of a matrix, and, I can emphasize, this is probably the most important use of matrices, certainly in terms of utilization.
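To see the "fix one component" point concretely, here is a sketch: take an assumed matrix A with a known eigenvalue λ, set one component of the eigenvector to 1, and solve the remaining homogeneous equation (this works as long as the coefficient we divide by is nonzero):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0                  # a known eigenvalue of A

# (A - lam*I) v = 0 has no unique solution, so fix v[0] = 1 and solve the
# first row for v[1]:  (2 - 3)*1 + 1*v1 = 0  =>  v1 = 1.
M = A - lam * np.eye(2)
v1 = -M[0, 0] * 1.0 / M[0, 1]   # assumes M[0, 1] != 0
v = np.array([1.0, v1])
assert np.allclose(A @ v, lam * v)   # (1, 1) is indeed an eigenvector
```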
I will stop today's lecture here. Just to remind ourselves: we first learnt about rotation matrices, and then we learnt about eigenvalues and eigenvectors. Thank you.
