So in this video we are talking about symmetric matrices. When do we call a matrix symmetric? A matrix is symmetric when the transpose of the matrix equals the matrix itself, that is, when A^T = A. Notice that in order for A to equal its transpose, A must be a square matrix.
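As a quick sketch of this check in code (using NumPy, with matrices of my own choosing rather than the ones on the board):

```python
import numpy as np

# A matrix is symmetric exactly when it equals its own transpose.
A = np.array([[2., 1.],
              [1., 3.]])   # off-diagonal entries match: symmetric
B = np.array([[2., 1.],
              [4., 3.]])   # off-diagonal entries differ: not symmetric

print(np.array_equal(A, A.T))  # True
print(np.array_equal(B, B.T))  # False
```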
Now let us take up a couple of examples. For instance, if I write down a matrix at random and look at its transpose, meaning rows become columns and columns become rows, you can see that the two do not agree, so that matrix is not symmetric. On the other hand, take another matrix of the same kind, say with 2 and 1 on the diagonal. We know that in order for it to be symmetric the two off-diagonal entries have to be identical; it does not matter what that common value is. If I look at the transpose here, the two matrices are the same.
There is one particular feature of symmetric matrices that we are going to consider in this lecture, namely diagonalizability. Remember that for an n x n square matrix to be diagonalizable, that is, in order for it to be similar to a diagonal matrix, we must have n linearly independent eigenvectors. For a symmetric matrix A we already have many good properties which not only make diagonalization possible but also make it much nicer: a symmetric matrix can always be diagonalized, and diagonalized orthogonally. That means we can find an orthogonal matrix P (remember what an orthogonal matrix is: a square matrix whose columns are orthonormal vectors) such that P^-1 A P = D, where D is a diagonal matrix and P is an orthogonal matrix. Now let us take up an
example from the book, because if I just make up an example out of my head it may not have good numbers. So here we go: we are looking at the matrix with rows (5, -4, -2), (-4, 5, 2), (-2, 2, 2). Let us try to orthogonally diagonalize this matrix. If we solve for its eigenvalues and eigenvectors, what you will notice is that it has two eigenvalues: one is 1, repeated with multiplicity 2, and the other one is 10. Of course we can already diagonalize in the ordinary way: for eigenvalue 1 we get the eigenvectors (1, 1, 0) and (1/2, 0, 1), and for eigenvalue 10 we get (-2, 2, 1). If we put these as the columns of a matrix, take its inverse, and multiply by the original matrix and then by that column matrix again, we get 1, 1, 10 on the diagonal. So A is similar to a diagonal matrix.
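This ordinary diagonalization can be checked numerically; the sketch below uses NumPy and the eigenvectors just read off:

```python
import numpy as np

A = np.array([[ 5., -4., -2.],
              [-4.,  5.,  2.],
              [-2.,  2.,  2.]])

# Columns: two eigenvectors for eigenvalue 1, one for eigenvalue 10.
P = np.array([[1., 0.5, -2.],
              [1., 0.0,  2.],
              [0., 1.0,  1.]])

# P^-1 A P should be the diagonal matrix diag(1, 1, 10).
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))
```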
What we need now is for these columns to be orthonormal; that is, we need them to be unit vectors, and we also want them to be orthogonal to each other. Here the two eigenvectors for eigenvalue 1 are each orthogonal to the eigenvector for eigenvalue 10, because of the spectral theorem for symmetric matrices that we had referred to before: eigenvectors corresponding to distinct eigenvalues are orthogonal. But the two eigenvectors for eigenvalue 1 are not orthogonal to each other, so what we can do is go ahead and use, remember what it is called, the Gram-Schmidt orthogonalization
process. These two linearly independent vectors, (1, 1, 0) and (1/2, 0, 1), form a basis of the eigenspace for eigenvalue 1. Let us take (1, 1, 0) as the first member of the set; for the second one we take the component of (1/2, 0, 1) that is perpendicular to (1, 1, 0). How do we do that? We take the dot product of the two vectors, divide by the dot product of (1, 1, 0) with itself, multiply that scalar by (1, 1, 0), and subtract the result from (1/2, 0, 1). What we get is the vector (1/4, -1/4, 1), and this vector and (1, 1, 0) are orthogonal to each other. Now let us create unit vectors along these. A unit vector along (1, 1, 0) is obtained by multiplying by 1 over the square root of 2, because the square root of 2 is its magnitude; and then we would also like to have a unit vector along (1/4, -1/4, 1).
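The Gram-Schmidt projection step above can be sketched as follows (NumPy, using the two eigenvectors for eigenvalue 1):

```python
import numpy as np

v1 = np.array([1.0, 1.0, 0.0])   # first eigenvector for eigenvalue 1
v2 = np.array([0.5, 0.0, 1.0])   # second eigenvector for eigenvalue 1

# Subtract from v2 its component along v1: one Gram-Schmidt step.
w2 = v2 - (v2 @ v1) / (v1 @ v1) * v1

print(w2)        # the vector (1/4, -1/4, 1)
print(w2 @ v1)   # dot product 0: the two are now orthogonal
```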
Instead of doing things in my head and making mistakes, let me just quickly compute the length, or magnitude, of (1/4, -1/4, 1): the square root of (1/4)^2 + (-1/4)^2 + 1^2, which comes out to (3/4) sqrt(2); I am writing it like that just for the sake of simplicity later. To normalize, we multiply the components by the reciprocal, 2 sqrt(2)/3. In the first component the 2 cancels against the 4, and 2 times 3 is 6, so we get sqrt(2)/6; the same thing in the second component gives -sqrt(2)/6, and the third component is 2 sqrt(2)/3. And just to make sure we did our calculations correctly, let us verify that the dot product of this vector with itself is 1: we get 2/36 + 2/36 + 8/9 = 1, so this is also a unit vector, and now we have total confidence
in ourselves. So what we have got are the first two vectors, (1/sqrt(2), 1/sqrt(2), 0) and (sqrt(2)/6, -sqrt(2)/6, 2 sqrt(2)/3). The third one, let us go back, was (-2, 2, 1), and that is not a unit vector; but no problem, here things are easy: 4 plus 4 is 8, plus 1 is 9, so its length is 3, and we get (-2/3, 2/3, 1/3). So this is a set of orthonormal vectors, and from where? From the eigenvectors of the given matrix A. We will go ahead and create the matrix P: we have 1/sqrt(2), 1/sqrt(2), and 0 as our first
column. Now let us quickly make the other columns: the second is sqrt(2)/6, then -sqrt(2)/6 (careful, that one is negative), then 2 sqrt(2)/3; and the third column is -2/3, 2/3, 1/3. And you can see here that this type of orthogonal diagonalization is something we can do only for symmetric matrices. Keeping this P, we take its inverse, multiply by the original matrix, and then by P, and we again expect 1, 1, 10 on the diagonal; remember, 1 is an eigenvalue of multiplicity two. And here we go: we get 1, 1, 10.
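Putting the three orthonormal columns together, the whole orthogonal diagonalization can be verified in a short NumPy sketch:

```python
import numpy as np

A = np.array([[ 5., -4., -2.],
              [-4.,  5.,  2.],
              [-2.,  2.,  2.]])

s2 = np.sqrt(2.0)
# The orthonormal eigenvectors found above, as the columns of P.
u1 = np.array([1/s2,   1/s2,   0.0  ])
u2 = np.array([s2/6,  -s2/6,   2*s2/3])
u3 = np.array([-2/3,   2/3,    1/3  ])
P = np.column_stack([u1, u2, u3])

# P is orthogonal, so P^T P = I and P^-1 = P^T.
print(np.allclose(P.T @ P, np.eye(3)))   # True
# P^T A P is the diagonal matrix diag(1, 1, 10).
print(np.round(P.T @ A @ P, 10))
```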
And the payoff is this, just using what we have in a really simple situation. Say I have a function, just a function, not necessarily a linear transformation: a simple function from R^2 to R, where the function does this: Q(x1, x2) = 3 x1^2 - 4 x1 x2 + 3 x2^2. Now
this is called a quadratic form: here you have a polynomial of degree two. And you can extend this to quadratic forms from R^n to R, which are functions whose values are polynomials of degree 2 in the variables of the domain space.
Now we can associate a symmetric matrix to this quadratic form in the following fashion. The coefficients of the perfect-square terms go on the diagonal, and the coefficient of the product term x1 x2 we divide by 2 (if we have more variables, like x1, x2, x3, and so on, the cross terms get arranged in the corresponding off-diagonal positions). So what we get is the symmetric matrix with rows (3, -2) and (-2, 3), and this matrix will come out to be pretty handy and useful in the following sense: if I take the row vector (x1, x2), multiply by the matrix, and then by the column vector (x1, x2), that product gives us the same functional output. Here we go, you can see these are identical. Now, since it is a symmetric
matrix, let us go ahead and orthogonally diagonalize it, and then we will see how that diagonalization helps us in the display of various level curves of this function. So here we go, let us find the eigenvectors: we have the eigenvectors (1, 1) and (-1, 1), corresponding to the eigenvalues 1 and 5. Normalizing is pretty easy here: we divide by the lengths of the vectors, which are quite easy to handle, both sqrt(2), so the columns become (1/sqrt(2), 1/sqrt(2)) and (-1/sqrt(2), 1/sqrt(2)). And if we go ahead and diagonalize, taking the inverse, then the matrix, then our orthogonal matrix, we get 1 and 5 on the diagonal, the way we wanted.
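A small NumPy sketch of both facts, that x^T A x reproduces the quadratic form and that P diagonalizes the associated matrix (the test point x is my own arbitrary choice):

```python
import numpy as np

# Associated symmetric matrix of Q(x1, x2) = 3 x1^2 - 4 x1 x2 + 3 x2^2:
# square coefficients on the diagonal, cross coefficient -4 split in half.
A = np.array([[ 3., -2.],
              [-2.,  3.]])

# x^T A x gives the same functional output as the polynomial.
x = np.array([1.7, -0.3])
q_matrix = x @ A @ x
q_poly   = 3*x[0]**2 - 4*x[0]*x[1] + 3*x[1]**2
print(np.isclose(q_matrix, q_poly))   # True

# Orthogonal diagonalization with the normalized eigenvectors as columns.
s2 = np.sqrt(2.0)
P = np.array([[1/s2, -1/s2],
              [1/s2,  1/s2]])
print(np.round(P.T @ A @ P, 10))      # the diagonal matrix diag(1, 5)
```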
Let us look at a level curve of this function of two variables. I am going to change x1, x2 to x, y just to help the grapher graph this function for me, and give it a value, say 12; I could have taken anything I wanted. If I plot it, you can see that what we get is an ellipse. Let me just make it a little bolder so that we can see it better in the picture: it is a tilted ellipse. Now what will happen is this. Instead of taking our coordinates with respect to the regular x and y axes, say we go along the eigenvector directions, so there will be another set of perpendicular axes. Along the first eigenvector the x and y coordinates are the same, so I can graph an axis like this, the line y = x; and for the other one, remember, x and y are negatives of each other, the line y = -x. If I do this, with the change of coordinates my ellipse will come out in a regular shape. Regarding the equation,
what I mean by that is the following. Say we go ahead and take a new set of coordinates, call them u1, u2, along those new lines. Since we are using P to make the transformation, P times the vector (u1, u2) will have to equal (x1, x2). Now notice that the value of our quadratic form, as we just saw, is the transpose of the vector (x1, x2), then in the middle our original associated symmetric matrix, and then the vector itself; I am just doing basic algebra inside your linear algebra. So when I substitute and take the transpose, the positions reverse: I get the transpose of (u1, u2) times the transpose of P. And since P is an orthogonal matrix, its transpose and its inverse will be the same in this case.
So in the middle we have the transpose of P, then A, then P, and as I just said the transpose is the same as the inverse, so this is the same as P^-1 A P. Quickly recall what that was: it was the diagonal matrix with 1 and 5 on the diagonal, since this was diagonalized. So the payoff we get here is that the middle matrix becomes the diagonal matrix (I do not know why I erased that, but it will not take us long to write the product again), with rows (1, 0) and (0, 5). And if you look at the value, what it becomes is u1^2 + 5 u2^2; I could have taken the variables as u and v or whatever. Then the equation of the level curve that we had taken now becomes a regular-looking equation of an ellipse: u1^2 + 5 u2^2 = 12. So this is a payoff.
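As a quick numerical check of this change of variables (the point u is an arbitrary choice of mine):

```python
import numpy as np

A = np.array([[ 3., -2.],
              [-2.,  3.]])
s2 = np.sqrt(2.0)
P = np.array([[1/s2, -1/s2],
              [1/s2,  1/s2]])

# Pick any point in the new coordinates, map it back with x = P u,
# and compare the two sides of the identity x^T A x = u1^2 + 5 u2^2.
u = np.array([2.0, -1.0])
x = P @ u
print(np.isclose(x @ A @ x, u[0]**2 + 5*u[1]**2))   # True
```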
We will continue further with more examples; let me know if you have any questions.
