Let A be an n-by-n matrix. A is said to be symmetric if A is equal to A transpose. For example, these are all symmetric matrices.
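The definition is easy to check numerically; here is a minimal sketch using NumPy, with hypothetical matrix entries:

```python
import numpy as np

# A hypothetical symmetric matrix: it equals its own transpose.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 4.0],
              [0.0, 4.0, 5.0]])

print(np.array_equal(A, A.T))  # True: A is symmetric
```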
Symmetric matrices are found in many applications, such as control theory, statistical analyses, and optimization.
There are two important properties satisfied by real symmetric matrices. The first is that such matrices have only real eigenvalues. The proof of this requires a bit of ingenuity, but the 2-by-2 case is rather straightforward.
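A sketch of the 2-by-2 case: a real symmetric 2-by-2 matrix has the form

```latex
A=\begin{pmatrix} a & b \\ b & c \end{pmatrix},
\qquad
\det(A-\lambda I)=\lambda^{2}-(a+c)\lambda+(ac-b^{2}),
```

and the discriminant of this quadratic is $(a+c)^{2}-4(ac-b^{2})=(a-c)^{2}+4b^{2}\ge 0$, so both eigenvalues are real.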
The second property is that every real symmetric matrix is diagonalizable. In fact, we can say a bit more. If A is a real symmetric matrix, then A can be written as U times D times U transpose for some diagonal matrix D and some matrix U satisfying U transpose equals U inverse. A real square matrix that satisfies this is called orthogonal. And if a matrix can be written like this, then it's said to be orthogonally diagonalizable.
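As a sketch of what an orthogonal diagonalization looks like in practice (with hypothetical entries), NumPy's `eigh` routine, which is designed for symmetric matrices, returns such a U and D:

```python
import numpy as np

# A hypothetical real symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns real eigenvalues and an orthogonal matrix of eigenvectors.
eigenvalues, U = np.linalg.eigh(A)
D = np.diag(eigenvalues)

print(np.allclose(A, U @ D @ U.T))        # True: A = U D U^T
print(np.allclose(U.T @ U, np.eye(2)))    # True: U^T = U^{-1}
```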
The proof of this result is rather tricky. Here we'll prove the result for the case when the eigenvalues are distinct. First, let's establish the following result: if A is a real symmetric matrix and u and v are eigenvectors of A with distinct eigenvalues lambda and gamma, respectively, then u transpose times v is equal to 0. Or equivalently, u and v are orthogonal.
To see this, let's look at lambda times u transpose times v. This can be written as (lambda times u) transpose times v. But lambda times u is equal to A times u, because u is an eigenvector of A with lambda as its eigenvalue. So this can be rewritten as (A times u) transpose times v. Expanding the transpose, we have u transpose times A transpose times v. A transpose is the same as A because A is symmetric, so this is the same as u transpose times A times v. And now A times v is gamma times v, and so this is equal to gamma times u transpose times v. So we have established that lambda times u transpose times v is the same as gamma times u transpose times v. And if u transpose times v is non-zero, then lambda is equal to gamma, which is a contradiction. And so u transpose times v must be 0.
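This lemma can be illustrated numerically; a minimal sketch with a hypothetical symmetric matrix whose eigenvalues are distinct:

```python
import numpy as np

# A hypothetical real symmetric matrix with distinct eigenvalues.
A = np.array([[4.0, 1.0],
              [1.0, 2.0]])

eigenvalues, vectors = np.linalg.eigh(A)
u, v = vectors[:, 0], vectors[:, 1]

# u and v belong to distinct eigenvalues, so u^T v is (numerically) zero,
# consistent with the lemma just proved.
print(np.isclose(u @ v, 0.0))  # True
```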
Now, let's take A to be a symmetric n-by-n matrix with real entries, and suppose that it has the eigenvalues lambda_1 to lambda_n, all distinct. Remember, we stated earlier that real symmetric matrices have only real eigenvalues, so all of these are real. And since the n eigenvalues are distinct, each eigenspace is one-dimensional, so it is spanned by a single vector. We'll call that vector u_i for the eigenvalue lambda_i. Because the eigenvalue is real and the matrix A is real, these u_i's can be taken to be real. And we're going to assume that each u_i is a unit vector. If not, we can replace u_i by 1 over the norm of u_i times u_i. This step is called normalization.
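The normalization step is just a scalar division; a one-line sketch with a hypothetical vector:

```python
import numpy as np

# Normalization: replace u by (1 / ||u||) times u.
u = np.array([3.0, 4.0])
u_hat = u / np.linalg.norm(u)

print(np.isclose(np.linalg.norm(u_hat), 1.0))  # True: u_hat is a unit vector
```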
And so, if we let U be the n-by-n matrix with columns u_1 up to u_n, and D be the diagonal matrix with lambda_1 up to lambda_n on the diagonal and zeros everywhere else, then we can write A as U times D times U inverse. In order to complete the proof, we have to show that U inverse here is the same as U transpose.
Let's now look at the product U transpose times U. If you look at the (i,j)-entry of this product, it's given by row i of U transpose times column j of U. But row i of U transpose is precisely column i of U, transposed. And so we can write this entry as u_i transpose times u_j. But this is 0 if i is not equal to j, because u_i and u_j are eigenvectors of A with different eigenvalues, and here we are using the result we established above. And it's 1 if i is equal to j, because then we're just taking u_i transpose times u_i, which is the square of the norm of u_i, and because u_i is a unit vector, this is 1. And what this gives us is that U transpose times U is the same as the n-by-n identity matrix.
So what this proof shows is that, in the case when the eigenvalues are distinct, you simply follow the procedure for diagonalizing A and then normalize each eigenvector. The resulting product gives you an orthogonal diagonalization of A, because U inverse will be equal to U transpose.
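The procedure just described can be sketched as follows, with hypothetical entries; a general eigenvalue routine is used here, followed by normalizing each eigenvector:

```python
import numpy as np

# A hypothetical real symmetric matrix with distinct eigenvalues.
A = np.array([[5.0, 2.0, 0.0],
              [2.0, 5.0, 3.0],
              [0.0, 3.0, 5.0]])

# Step 1: diagonalize A (columns of U are eigenvectors).
eigenvalues, U = np.linalg.eig(A)

# Step 2: normalize each eigenvector (each column of U).
U = U / np.linalg.norm(U, axis=0)

D = np.diag(eigenvalues)

# The result is an orthogonal diagonalization: U^{-1} = U^T.
print(np.allclose(A, U @ D @ U.T))          # True
print(np.allclose(U.T, np.linalg.inv(U)))   # True
```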
We'll see an example of this in the next video.
