Now I am not mapping some three variables to three other variables; I want to map directed line segments onto directed line segments. What is a directed line segment? It is a geometrical object which will have multiple representations, that is, multiple triples of numbers representing the same vector. So then, how do I construct a linear function which is invariant to these multiple representations of a directed line segment? That is the question. To answer it we have to look into tensor algebra.
Now, what is a tensor? A second order tensor (why it is called second order we will see shortly) is a linear mapping between two directed line segments. That is, if x and y are directed line segments, then y = A x, where A is a second order tensor.
Now I want to generate a representation for this linear function, just as I generated representations for the vector. To do that we have to understand one more product involving two vectors, called the tensor product. The tensor product of vectors a and b is defined through its action on a vector: (a ⊗ b) c = (b · c) a, where c is any vector; a ⊗ b is a second order tensor. That is the definition. So this can be written as A c = x, where A = a ⊗ b, and x in this case happens to be (b · c) a, that is, the vector a scaled by the projection of b on c. This is a second order tensor, but a special one: a ⊗ b is not a general second order tensor, it is just an example of a second order tensor.
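The defining property (a ⊗ b) c = (b · c) a is easy to check numerically. A minimal sketch, where the sample vectors are my own choice and numpy's outer product stands in for the dyad:

```python
import numpy as np

# Defining property of the tensor (dyadic) product: (a ⊗ b) c = (b · c) a.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
c = np.array([7.0, 8.0, 9.0])

A = np.outer(a, b)            # matrix of the dyad a ⊗ b

lhs = A @ c                   # (a ⊗ b) acting on c
rhs = np.dot(b, c) * a        # the vector a scaled by b · c

print(np.allclose(lhs, rhs))  # True
```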
Now let me go ahead and define a general second order tensor:

A = a_ij e_i ⊗ e_j,

where I have to sum i = 1, 2, 3 and I have to sum j = 1, 2, 3. Written out, this means

A = a_11 e_1 ⊗ e_1 + a_12 e_1 ⊗ e_2 + a_13 e_1 ⊗ e_3 + a_21 e_2 ⊗ e_1 + a_22 e_2 ⊗ e_2 + a_23 e_2 ⊗ e_3 + a_31 e_3 ⊗ e_1 + a_32 e_3 ⊗ e_2 + a_33 e_3 ⊗ e_3.

This is a linear combination of nine tensor products, where e_1, e_2, e_3 are the orthonormal basis vectors, that is, the coordinate basis that you choose to represent any directed line segment. By the way, a tensor product of two vectors is called a dyad, and a linear combination of tensor products, or dyads, is called a dyadic.
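As a quick numerical sketch of this expansion (with sample components of my own choosing), summing the nine weighted dyads reproduces the matrix of components:

```python
import numpy as np

# Check that A = sum over i, j of a_ij e_i ⊗ e_j reproduces the
# component matrix a_ij (indices are 0-based in the code).
a_ij = np.arange(1.0, 10.0).reshape(3, 3)   # sample components
e = np.eye(3)                                # orthonormal basis e_1, e_2, e_3

A = sum(a_ij[i, j] * np.outer(e[i], e[j])
        for i in range(3) for j in range(3))

print(np.allclose(A, a_ij))  # True
```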
Now what I am interested in is how to find the components a_ij. We have written the vector equation y = A x. What is the component y_i? It is y_i = y · e_i, and from here this implies (A x) · e_i = y_i. Now what is x? I can write it as x = x_j e_j, so y_i = (A (x_j e_j)) · e_i; x_j is a scalar, so I pull it out of the expression to get y_i = x_j (A e_j) · e_i. Now if I call (A e_j) · e_i = a_ij, then what I get is y_i = a_ij x_j: the column of components of y equals the matrix of the a_ij times the column of components of x, because I am summing over the column index j of the matrix against the entries x_j of the vector. So this is a valid matrix-vector multiplication operation, and hence you have this representation for A x. What I have done here is represent the components of A as a three by three matrix, with the first index denoting the row and the second index denoting the column.
So we find that a_ij = (A e_j) · e_i; that is how we find the components. So in general a second order tensor can be written as A = ((A e_j) · e_i) e_i ⊗ e_j. Here i and j are dummy indices, because each is repeated twice on the same side of the equals sign, which means I have to sum them from 1 to 3.
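This recipe for the components can also be checked numerically; a sketch, with a sample matrix of my own standing in for the tensor:

```python
import numpy as np

# Recover the components of a tensor from its action on the basis:
# a_ij = (A e_j) · e_i.
A = np.array([[2.0, 0.5, 0.0],
              [1.0, 3.0, 0.2],
              [0.0, 0.4, 5.0]])
e = np.eye(3)

a = np.array([[np.dot(A @ e[j], e[i]) for j in range(3)]
              for i in range(3)])

print(np.allclose(a, A))  # True
```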
Now let us find the components of the tensor a ⊗ b. I have to apply the same rule: the ij component of a ⊗ b is ((a ⊗ b) e_j) · e_i. From the definition of the tensor product this is ((b · e_j) a) · e_i, which is (a · e_i)(b · e_j), which is nothing but a_i b_j.
So the matrix of components of a ⊗ b has rows (a_1 b_1, a_1 b_2, a_1 b_3), (a_2 b_1, a_2 b_2, a_2 b_3) and (a_3 b_1, a_3 b_2, a_3 b_3). This is nothing but the column (a_1, a_2, a_3) multiplied by the row (b_1, b_2, b_3): a three by one vector times a one by three vector gives a three by three matrix. In other words, this is a b^T, if I view vectors as column vectors; viewing a vector as a column means I arrange its components in a column, and then it is called a column vector. In contrast, the scalar product of two vectors, a · b, would be a^T b: here a^T is one by three, b is three by one, and the answer is a one by one scalar value. So that is what the tensor product of two vectors means: it is a b^T.
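The contrast between a b^T (a matrix) and a^T b (a scalar) can be sketched directly with sample vectors:

```python
import numpy as np

# The dyad a ⊗ b as the column-times-row product a b^T (a 3x3 matrix),
# contrasted with the scalar product a · b = a^T b (a single number).
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

dyad = a[:, None] @ b[None, :]   # (3x1)(1x3) -> 3x3, same as np.outer(a, b)
scalar = a @ b                   # (1x3)(3x1) -> scalar, same as np.dot(a, b)

print(np.allclose(dyad, np.outer(a, b)))  # True
print(scalar)                             # 32.0
```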
Let us move a little further and understand what a tensor basis means. I am interested in finding e_1 ⊗ e_1: it is the column (1, 0, 0) times the row (1, 0, 0), so it is the matrix with 1 in the (1, 1) entry and zeros everywhere else. Similarly, e_1 ⊗ e_2 is the column (1, 0, 0) times the row (0, 1, 0), which is the matrix with 1 in the (1, 2) entry and zeros everywhere else. In general, e_i ⊗ e_j is the matrix whose ij-th component is one while the rest of the components are zero. So what does that mean? Just like (1, 0, 0), (0, 1, 0), (0, 0, 1) are the basis vectors for representing a vector, the matrices with one entry equal to one and the rest zero are the basis for representing a tensor.
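A short sketch verifying that each basis dyad e_i ⊗ e_j is exactly the matrix with a single one in the (i, j) slot:

```python
import numpy as np

# e_i ⊗ e_j has 1 in the (i, j) entry and 0 elsewhere, so the nine dyads
# play the role for tensors that the standard basis vectors play for vectors.
e = np.eye(3)

checks = []
for i in range(3):
    for j in range(3):
        expected = np.zeros((3, 3))
        expected[i, j] = 1.0
        checks.append(np.allclose(np.outer(e[i], e[j]), expected))

print(all(checks))  # True
```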
So e_i ⊗ e_j forms the basis for second order tensors. Now, finally, let us look at whether any matrix is a second order tensor. The answer is emphatically no, because an arbitrary matrix need not transform the way a tensor should transform: it is just a three by three array of numbers, and when the coordinate system changes there is no meaning attached to how the components of such a matrix should change. For example, the transformation matrix Q is a matrix, and it is not a second order tensor. So an arbitrary matrix will not be a second order tensor, but a second order tensor can be represented as a matrix; a matrix is one representation of a second order tensor. Another representation is to write the components of A as a nine by one vector: a_11, a_22, a_33, a_12, and so on. These are mere representations of a second order tensor, for convenience of calculation. What is its true meaning? Just as a directed line segment has a direction and a magnitude fixed irrespective of how you represent it, a second order tensor is a fixed object: the set of matrices obtained in the different coordinate bases together represent the same second order tensor. In the next class, what we will do is find the components of the stress tensor, and then we will go forward to look at how a stress tensor transforms under changes of coordinate basis. There we will study Mohr's circle, where all the points on the circle represent the same stress state, that is, the same tensor which represents the stress.
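The point that the 3x3 matrix and the 9x1 vector are just two layouts of the same nine components can be sketched as follows (the ordering of the nine entries is a convention; here I simply use row-major order rather than the a_11, a_22, a_33, a_12, ... ordering mentioned above):

```python
import numpy as np

# Two equivalent component layouts for the same second order tensor:
# a 3x3 matrix, or the same nine numbers as a 9x1 column vector.
A = np.arange(1.0, 10.0).reshape(3, 3)   # sample components

v = A.reshape(9, 1)                      # 9x1 representation (row-major)

print(v.shape)                           # (9, 1)
print(np.allclose(v.reshape(3, 3), A))   # True
```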
So with this we will conclude this lecture. What you have seen is that the traction in general depends upon time, the particle that you are looking at (that is, its position vector in space), and the normal to the cut surface that passes through that particle. Then we went ahead and looked at what a linear function is, and then we looked at tensor algebra, that is, how to represent a linear function of directed line segments.
Thank you.
