In the last lecture we discussed weak convergence; we covered two modes of convergence, namely convergence in probability and convergence in the mean-square sense. These are weak modes of convergence because we consider some function of the random variables. For convergence in probability we consider the probability P(|X_n − X| > ε) for an arbitrarily small ε > 0; this probability should go down to 0 as n tends to infinity. Similarly, for mean-square convergence we consider E[(X_n − X)^2], and this quantity should go down to 0 as n tends to infinity. We also discussed how these two modes are interrelated.
We will now discuss another weak convergence concept, in fact the weakest one: convergence in distribution. Let us see the definition. Consider a random sequence {X_n}, n = 1, 2, ..., and a random variable X.
Suppose F_Xn(x) and F_X(x) are the distribution functions of X_n and X respectively. The sequence is said to converge to X in distribution if the corresponding sequence of distribution functions converges, that is, if the limit of F_Xn(x) as n tends to infinity equals F_X(x) for all x at which F_X(x) is continuous. This is the requirement: the limit should equal F_X(x) at every x where F_X is continuous. We write this convergence as X_n → X in distribution, and sometimes it is also said that X_n converges in law to X.
Essentially, the sequence of distribution functions converges to the distribution function of the random variable X at all points of continuity. We have to note two things. First, we are considering the limit of a sequence of distribution functions, and there is no guarantee that this limit is itself a distribution function; we speak of convergence in distribution only if the sequence converges to a valid distribution function. For example, consider the function F_Xn(x) = 1 for x ≥ n, and 0 otherwise.
Now, as n tends to infinity, this function converges to 0 at every fixed x: once n exceeds x, F_Xn(x) = 0. So the pointwise limit of F_Xn(x) is identically 0, which is not a distribution function. Therefore we have to ensure that the distribution function sequence converges to a distribution function.
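This pointwise limit is easy to check numerically; the following sketch (not from the lecture, just an illustration of F_Xn(x) = 1 for x ≥ n) evaluates the sequence at a fixed x:

```python
# F is the CDF of the degenerate random variable X_n = n:
# it equals 1 for x >= n and 0 otherwise.
def F(n, x):
    return 1.0 if x >= n else 0.0

# For any fixed x, once n exceeds x the value drops to 0, so the
# pointwise limit is identically 0 -- it never reaches 1 and is
# therefore not a valid distribution function.
x = 100.0
print([F(n, x) for n in (10, 100, 1000, 10_000)])  # [1.0, 1.0, 0.0, 0.0]
```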
The second thing is that F_Xn(x) converges to F_X(x) for all x at which F_X(x) is continuous; that is important. For example, suppose F_Xn(x) = 0 for x < 1/n and 1 for x ≥ 1/n, which is the distribution function of X_n = 1/n. And suppose F_X(x) = 0 for x < 0 and 1 for x ≥ 0; this essentially means the deterministic random variable X = 0.

Now take the limit as n tends to infinity. For any x > 0 the limit of F_Xn(x) is 1, because eventually 1/n ≤ x. But at the point x = 0, F_Xn(0) is always equal to 0, unlike the target function, which has F_X(x) = 1 for all x ≥ 0. So at x = 0 the sequence is not converging to F_X: F_Xn(0) remains 0 for every n, while F_X(0) = 1. That is why the convergence is required only at the points where F_X(x) is continuous, and at x = 0, F_X(x) is not continuous.
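The role of the continuity point can be checked numerically; a minimal sketch that just encodes the two step functions from this example:

```python
# F_n is the CDF of X_n = 1/n; F is the CDF of the constant X = 0.
def F_n(n, x):
    return 1.0 if x >= 1.0 / n else 0.0

def F(x):
    return 1.0 if x >= 0 else 0.0

# At any continuity point of F (any x != 0) the limit matches F:
for x in (-0.5, 0.5):
    print(x, F_n(10**6, x), F(x))  # the two values agree

# At the discontinuity x = 0 they do not agree: F_n(n, 0) = 0 for
# every n, while F(0) = 1 -- but x = 0 is excluded by the definition.
print(F_n(10**6, 0.0), F(0.0))
```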
So we are not bothered about the convergence at that point; at every other point, where F_X(x) is continuous, F_Xn(x) converges, and so we say that this sequence converges in distribution.

We can consider another example. Suppose {X_i} is a sequence of independent random variables, with each X_i uniformly distributed between 0 and a.
In this case we define the sequence Z_n = max(X_1, X_2, ..., X_n) and consider its distribution function F_Zn(z) = P(Z_n ≤ z) = P(max(X_1, X_2, ..., X_n) ≤ z). Since the maximum is less than or equal to z exactly when each of the random variables is less than or equal to z, this is the same as P(X_1 ≤ z, X_2 ≤ z, ..., X_n ≤ z).

Now we can use the independence property, so this equals [P(X_1 ≤ z)]^n. Since X_1 is uniformly distributed, its pdf is 1/a on (0, a), and integrating gives P(X_1 ≤ z) = z/a for z lying between 0 and a. So the CDF is F_Zn(z) = 0 for z < 0, (z/a)^n for 0 ≤ z ≤ a, and 1 for z > a. If I now take the limit, I get lim F_Zn(z) = F_Z(z) = 0 for z < a and 1 for z ≥ a.
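This limit can also be verified numerically; a small sketch assuming the concrete value a = 2 (an assumption made here for illustration):

```python
import random

random.seed(0)
a = 2.0

def F_Zn(z, n):
    """Exact CDF of Z_n = max of n Uniform(0, a) variables: (z/a)^n on [0, a]."""
    if z < 0:
        return 0.0
    if z > a:
        return 1.0
    return (z / a) ** n

# (z/a)^n -> 0 for any z < a, so the limit CDF jumps from 0 to 1 at a:
print([round(F_Zn(1.9, n), 4) for n in (1, 10, 100, 1000)])

# Simulated maxima concentrate near a:
z = max(random.uniform(0, a) for _ in range(10_000))
print(z)  # close to a = 2
```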
So therefore Z_n converges in distribution to Z = a. Now we have considered the concept of convergence in distribution; how is it related to convergence in probability? If X_n converges to X in probability, then this implies that X_n also converges in distribution to X. We will see the proof.
Suppose X_n converges in probability to X. What does it say? It says that the limit of P(|X_n − X| > ε) is 0 for any arbitrarily small ε > 0.
Now we will consider the random variable X and the point x + ε on the real line. At x + ε we form a partition: one event is {X > x + ε} and the other is {X ≤ x + ε}. These two disjoint subsets give a partition of the sample space at the point x + ε.
Now we want to find the distribution function of X_n, so let us consider the event {X_n ≤ x}. Using the total probability theorem with these two partition events, we can write this event as the union of two disjoint events:

{X_n ≤ x} = {X_n ≤ x, X ≤ x + ε} ∪ {X_n ≤ x, X > x + ε}.
So therefore F_Xn(x) = P(X_n ≤ x), and considering the partition we write this as P(X_n ≤ x, X ≤ x + ε) + P(X_n ≤ x, X > x + ε), where the comma denotes the joint event.
Now, the first term is the probability of a joint event; if I consider only the single event {X ≤ x + ε}, the joint probability can only be smaller, so P(X_n ≤ x, X ≤ x + ε) ≤ P(X ≤ x + ε). For the second term, observe that on the event {X_n ≤ x, X > x + ε} the difference between X and X_n is greater than ε, so this event is a subset of {|X_n − X| > ε}, and again P(X_n ≤ x, X > x + ε) ≤ P(|X_n − X| > ε). Therefore F_Xn(x) ≤ P(X ≤ x + ε) + P(|X_n − X| > ε).
Now, what happens as n tends to infinity? We are given that the sequence converges in probability, so the second term goes to 0. Therefore we get lim F_Xn(x) ≤ F_X(x + ε): the CDF of X_n at the point x is, in the limit, at most the CDF of X at the point x + ε.
Similarly, we can consider a partition of the real line at the point x − ε, this time in terms of the random variable X_n: the two events are {X_n ≤ x − ε} and {X_n > x − ε}. Continuing in the same way, we can show that F_X(x − ε) ≤ lim F_Xn(x). So we have shown two inequalities: earlier, that the limit of F_Xn(x) is at most F_X(x + ε), and now, that F_X(x − ε) is at most the limit of F_Xn(x).
From these two inequalities we get that the limit of F_Xn(x) lies between F_X(x − ε) and F_X(x + ε), where ε is an arbitrarily small number. Therefore, letting ε → 0 at any point x where F_X is continuous, the limit of F_Xn(x) equals F_X(x). So this says that if the sequence X_n is convergent in probability, it will be convergent in distribution also.
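A concrete instance of this theorem can be checked directly. The sketch below assumes X ~ Uniform(0, 1) and X_n = X + 1/n; this example is not from the lecture, but X_n → X in probability trivially (|X_n − X| = 1/n deterministically), so by the theorem the CDFs must converge:

```python
def F_X(x):
    """CDF of Uniform(0, 1)."""
    return min(max(x, 0.0), 1.0)

def F_Xn(n, x):
    """CDF of X_n = X + 1/n, i.e. F_X shifted right by 1/n."""
    return F_X(x - 1.0 / n)

# At any fixed x the sequence of CDF values approaches F_X(x):
x = 0.3
print([round(F_Xn(n, x), 6) for n in (1, 10, 100, 10_000)])
# The values approach F_X(0.3) = 0.3.
```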
Now, the converse of this theorem is not true; that means convergence in distribution does not imply convergence in probability. Consider the sample space S = {H, T}, head and tail, with the probability of head and the probability of tail each equal to 1/2.
Now we define a sequence of random variables by X_n(H) = 1 and X_n(T) = 0 for every n, and another random variable X mapped the other way: X(H) = 0 and X(T) = 1. So where X_n maps H to 1, X maps H to 0.
Now let us draw the distribution functions of both. For X_n: the value 0 occurs with the probability of tail, 1/2, and the value 1 with the probability of head, 1/2. So F_Xn(x) is 0 for x < 0, jumps to 1/2 at x = 0, stays at 1/2 up to x = 1, and jumps to 1 at x = 1, where the cumulative probability becomes 1. Similarly for F_X(x): here the value 0 corresponds to head and the value 1 to tail, each with probability 1/2, so F_X(x) is also 1/2 at 0 and reaches 1 at 1.
So now these two distribution functions coincide. But here we have to remember that the mappings are different. Therefore let us see what happens to the probability of the event where X_n and X differ considerably, that is, the probability of those sample points where |X_n − X| > ε.
So let us see which outcomes s in S satisfy |X_n(s) − X(s)| > ε. Whenever s = H, X_n(H) = 1 and X(H) = 0, so their difference is greater than ε; head is included. When s = T, X_n(T) = 0 but X(T) = 1, so their difference is again greater than ε; tail is also included.
So the probability of this set is equal to 1; it does not go down to 0, which means X_n does not converge in probability to X. This is a counterexample showing that convergence in distribution does not imply convergence in probability.
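The counterexample can be simulated; a short sketch of the coin-toss construction above:

```python
import random

random.seed(1)
tosses = [random.choice("HT") for _ in range(100_000)]

# X_n(H) = 1, X_n(T) = 0 for every n; X(H) = 0, X(T) = 1.
Xn = [1 if s == "H" else 0 for s in tosses]
X  = [0 if s == "H" else 1 for s in tosses]

# Same distribution: both are Bernoulli(1/2), so the CDFs coincide ...
print(sum(Xn) / len(Xn), sum(X) / len(X))   # both near 0.5

# ... yet |X_n - X| = 1 on every sample point, so
# P(|X_n - X| > eps) = 1 for any eps < 1: no convergence in probability.
print(sum(abs(a - b) > 0.5 for a, b in zip(Xn, X)) / len(X))
```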
So that way we have now discussed convergence in distribution; earlier we considered convergence in probability and convergence in the mean-square sense, and now another concept, convergence in distribution.
Also, if X is continuous there is another theorem, which we will not prove but just state. Suppose each X_n is continuous and X is also continuous. Then X_n converges in distribution to X if and only if the pdf f_Xn(x) converges to the pdf f_X(x) at almost all x. So this is the theorem; it means that if X_n converges in distribution, then the corresponding density functions also converge, in the case of continuous random variables.
That is, if we consider a sequence of continuous random variables and this sequence is convergent in distribution, then their pdfs will also converge. And since this is an if-and-only-if condition, if the pdf sequence converges, then the sequence will also converge in distribution; that means the corresponding sequence of distribution functions will also converge.
Similarly, we can write the discrete case in terms of the probability mass function. If each X_n is discrete and X is also discrete, then X_n converges in distribution to X if and only if the pmf p_Xn(k) converges to p_X(k) at every discrete point k. So what this says is that the convergence in distribution of a sequence of random variables can be studied in terms of their pdfs or pmfs.
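A standard illustration of the discrete case (an assumed example, not from the lecture) is X_n ~ Binomial(n, λ/n) converging in distribution to X ~ Poisson(λ); correspondingly, the pmfs converge at every integer k:

```python
from math import comb, exp, factorial

lam = 3.0

def binom_pmf(n, k):
    """pmf of Binomial(n, lam/n) at k."""
    p = lam / n
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k):
    """pmf of Poisson(lam) at k."""
    return exp(-lam) * lam**k / factorial(k)

# For large n the binomial pmf is close to the Poisson pmf at each k:
for k in range(5):
    print(k, round(binom_pmf(10_000, k), 6), round(poisson_pmf(k), 6))
```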
Next we will state another important theorem without proof; it is required when we study the application of this mode of convergence, namely to prove the central limit theorem. This is the continuity theorem of convergence. What does it say? Suppose F_Xn(x) and F_X(x) are the distribution functions of X_n and X respectively, and M_Xn(s) = E[e^{s X_n}]. What is this? This is the moment generating function (MGF) of X_n.
Similarly, M_X(s) = E[e^{s X}] is the MGF of X. So now we have a CDF sequence and an MGF sequence. If the MGF sequence M_Xn(s) converges to M_X(s) for s near 0, and M_X(s) is continuous at s = 0 (this is another requirement), then the corresponding CDF sequence will also converge; in other words, X_n converges to X in distribution.
That means, if we have to prove that X_n converges to X in distribution, we can start with the MGF sequence: if the moment generating function of X_n converges to the moment generating function of X near s = 0, and M_X(s) is continuous at s = 0, then according to this theorem we can conclude that X_n converges in distribution to X.
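As a sketch of how the theorem is applied (using the same assumed Binomial-to-Poisson example, not from the lecture): M_Xn(s) = (1 − p + p e^s)^n with p = λ/n converges, for s near 0, to the Poisson MGF M_X(s) = exp(λ(e^s − 1)), which is continuous at s = 0, so the theorem gives X_n → X in distribution:

```python
from math import exp

lam = 3.0

def mgf_binom(n, s):
    """MGF of Binomial(n, lam/n): (1 - p + p e^s)^n."""
    p = lam / n
    return (1 - p + p * exp(s)) ** n

def mgf_poisson(s):
    """MGF of Poisson(lam): exp(lam (e^s - 1))."""
    return exp(lam * (exp(s) - 1.0))

# For s near 0 the binomial MGF approaches the Poisson MGF:
for s in (-0.1, 0.0, 0.1):
    print(s, round(mgf_binom(100_000, s), 6), round(mgf_poisson(s), 6))
```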
So this is the continuity theorem of convergence, and it is one important result which we will be using. Let us see the entire relation between the different convergence modes. We have considered convergence everywhere, that is, convergence for every sample point. That is a very strong sense of convergence; because it is the strongest, it implies convergence with probability 1, also called almost-sure convergence. This in turn implies convergence in probability, and finally, convergence in probability implies convergence in distribution.
There is another type of convergence, convergence in the mean-square sense. Neither almost-sure convergence nor mean-square convergence implies the other, so mean-square convergence is considered independently. If X_n converges in the mean-square sense, this always implies convergence in probability, and convergence in probability implies convergence in distribution.
So that way we have considered these five modes of convergence: convergence everywhere, convergence almost sure, convergence in probability, convergence in distribution, and convergence in the mean-square sense. Convergence in distribution is the weakest mode of convergence. Now we will see some applications of these convergence concepts. We are particularly interested in two important results.
The first is known as the laws of large numbers, and the second one is the central limit theorem. Here we have the weak law of large numbers and the strong law of large numbers, which apply the corresponding weak and strong modes of convergence; the central limit theorem is an application of convergence in distribution.
Let us summarize the key points of the lecture. A random sequence X_n is said to converge to X in distribution if the limit of F_Xn(x) equals F_X(x) for all x at which F_X(x) is continuous; this is the condition we impose, and that is how we define convergence in distribution. It is the weakest mode of convergence. We also proved the theorem that if X_n converges in probability, then X_n converges in distribution, and with a counterexample we showed that the converse of the theorem is not true; that is, convergence in distribution does not imply convergence in probability.
Then we stated the continuity theorem of convergence: if the moment generating function M_Xn(s) converges to M_X(s) near s = 0, and M_X(s) is continuous at s = 0, then F_Xn(x) converges to F_X(x), that is, the distribution function of X_n converges to the distribution function of X. So this is one important theorem, the continuity theorem of convergence.
Now let us summarize the relations between the different convergence modes of a random sequence: X_n converging almost surely implies that X_n converges in probability, which implies that X_n converges in distribution. Similarly, X_n converging in the mean-square sense implies that X_n converges in probability, and this in turn implies that X_n converges in distribution.

Thank you.
