The following content is
provided under a Creative
Commons license.
Your support will help
MIT OpenCourseWare
continue to offer high quality
educational resources for free.
To make a donation or to
view additional materials
from hundreds of MIT courses
visit MIT OpenCourseWare
at ocw.mit.edu.
PROFESSOR: Today
we are continuing
with improper integrals.
I still have a little bit
more to tell you about them.
What we were discussing at
the very end of last time
was improper integrals.
These are going to be improper integrals of the second kind.
By second kind I mean that
they have a singularity
at a finite place.
That would be
something like this.
So here's the
definition if you like.
Same sort of thing as we
did when the singularity
was at infinity.
So if you have the integral from 0 to 1 of f(x) dx, this is going to be the same thing as the limit, as a goes to 0 from above, of the integral from a to 1 of f(x) dx.
And the idea here is the same
one that we had at infinity.
Let me draw a picture of it.
You have, imagine a function
which is coming down like this
and here's the point 1.
And we don't know whether
the area enclosed is
going to be infinite or
finite and so we cut it off
at some place a.
And we let a go to 0 from above.
So really it's 0+.
So we're coming in
from the right here.
And we're counting up
the area in this chunk.
And we're seeing as it expands
whether it goes to infinity
or whether it tends
to some finite limit.
Right, so this is the example
and this is the definition.
And just as we did for the
other kind of improper integral,
we say that this converges -- so that's the key word here -- if the limit exists (and is finite), and diverges if not.
Let's just take care
of the basic examples.
First of all I wrote
this one down last time.
We're going to
evaluate this one.
The integral from 0 to 1 of
1 over the square root of x.
And this just, you almost
don't notice the fact
that it goes to infinity.
This goes to infinity
as x goes to 0.
But if you evaluate it -- first
of all we always write this
as a power.
Right?
To get the evaluation.
And then I'm not even going
to replace the 0 by an a.
I'm just going to leave it as 0.
The antiderivative here
is x^(1/2) times 2.
And then I evaluate
that at 0 and 1.
And I get 2.
2 minus 0, which is 2.
All right so this
one is convergent.
And not only is it convergent
but we can evaluate it.
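As a quick check of that limit definition, here is a minimal Python sketch (the function name is my own, for illustration): as the cutoff a shrinks to 0 from above, the integral from a to 1 climbs toward 2.

```python
import math

def sqrt_integral_from(a):
    """Integral of x**(-1/2) on [a, 1], from the antiderivative 2*sqrt(x)."""
    return 2 * math.sqrt(1) - 2 * math.sqrt(a)

# Shrink the cutoff toward 0 from above: the values increase toward 2.
values = [sqrt_integral_from(10.0 ** -k) for k in range(1, 13)]
```

Each value is 2 minus 2 times the square root of the cutoff, so the limit is exactly 2.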
The second example,
being not systematic
but really giving you
the principal examples
that we'll be thinking about,
is this one here, dx / x.
And this one gives
you the antiderivative
as the logarithm.
Evaluated at 0 and 1.
And now again you have
to have this thought
process in your mind that
you're really taking the limit.
But this is going to be the
log of 1 minus the log of 0.
Really the log of 0 from above.
There is no such thing as
the log of 0 from below.
And this is minus infinity.
So it's 0 minus minus infinity,
which is plus infinity.
And so this one diverges.
All right so what's
the general--
So more or less in general -- for powers, anyway -- if you work out this thing for dx / x^p from 0 to 1, what you're going to find is that it's 1/(1-p) when p is less than 1, and it diverges for p >= 1. That's the final result; if you carry out this integration it's not difficult.
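The split at p = 1 can be seen numerically; a minimal sketch (the helper name is mine), using the antiderivative on [a, 1] and shrinking a toward 0:

```python
import math

def power_integral(p, a):
    """Integral of x**(-p) on [a, 1], computed from the antiderivative."""
    if p == 1:
        return -math.log(a)          # log(1) - log(a)
    return (1 - a ** (1 - p)) / (1 - p)

# p < 1: finite limit 1/(1-p) as a -> 0+.
# p >= 1: the value blows up as a -> 0+.
small = 1e-12
```

For p = 1/2 this recovers the value 2 = 1/(1 - 1/2) from the first example; for p = 1 and p = 2 the values grow without bound as the cutoff shrinks.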
All right so now
I just want to try
to help you to remember this.
And to think about how
you should think about it.
So I'm going to say
it in a few more ways.
All right just repeat
what I've said already
but try to get it to
percolate and absorb itself.
And in order to do
that I have to make
the contrast between the
kind of improper integral
that I was dealing with before.
Which was not as x goes to 0
here but as x goes to infinity,
the other side.
Let's make this contrast.
First of all, if I look at the end that we have been paying attention to right now.
We've just considered
things like this.
1 over x to the 1/2.
Which is a lot smaller than 1/x.
Which is a lot smaller
than say 1/x^2.
Which would be another example.
This is as x goes to 0.
So this one's the smallest one.
This one's the
next smallest one.
And this one is very large.
On the other hand it goes
the other way at infinity.
As x tends to infinity.
All right so try to
keep that in mind.
And now I'm going to put a
little box around the bad guys
here.
As x goes to 0, the integrals of 1/x and 1/x^2 are divergent. And as x goes to infinity, the integrals of 1/x^(1/2) and 1/x are divergent.
The crossover point is 1/x.
When we get smaller
than that, we
get to things which
are convergent.
When we get smaller than
it on this other scale,
it's convergent.
All right so these
guys are divergent.
So they're associated
with divergent integrals.
The functions
themselves are just
tending towards-- well
these tend to infinity,
and these tend toward 0.
So I'm not talking about
the functions themselves
but the integrals.
Now I want to draw this
again here, not small enough.
I want to draw this again.
And, so I'm just going
to draw a picture
of what it is that I have here.
But I'm going to combine
these two pictures.
So here's the picture
for example of y = 1/x.
All right.
That's y y = 1/x.
And that picture
is very balanced.
It's symmetric on the two ends.
If I cut it in half then what
I get here is two halves.
And this one has infinite area.
That corresponds to the
integral from 1 to infinity,
dx / x being infinite.
And the other piece, which --
this one we calculated last
time, this is the one that
we just calculated over here
at Example 2 -- has
the same property.
It's infinite.
And that's the fact that the
integral from 0 to 1 of dx
/ x is infinite.
Right, so both, we
lose on both ends.
On the other hand if I
take something like --
I'm drawing it the same way
but it's really not the same --
y = 1 over the square root of x.
y = 1 / x^(1/2).
And if I cut that in half here, then past 1 this 1/x^(1/2) is actually bigger than 1/x. So this piece is infinite.
And this part over
here actually is going
to give us an honest number.
In fact this one is finite.
And we just checked
what the number is.
It actually happens
to have area 2.
And what's happening here
is if you would superimpose
this graph on the
other graph what you
would see is that they cross.
And this one sits on top.
So if I drew this one in
let's have another color here,
orange let's say.
If this were orange if
I set it on top here
it would go this way.
OK and underneath the
orange is still infinite.
So both of these are infinite.
On here on the other hand
underneath the orange
is infinite but underneath
where the green is is finite.
That's a smaller quantity.
Infinity is a lot bigger than 2.
2 is a lot less than infinity.
All right so that's reflected
in these comparisons here.
Now if you like if I want
to do these in green.
This guy is good and
this guy is good.
Well let me just repeat that
idea over here in this sort
of reversed picture
with y = 1/x^2.
If I chop that in half then
the good part is this end here.
This is finite.
And the bad part is
this part of here
which is way more singular.
And it's infinite.
All right so again what
I've just tried to do
is to give you some
geometric sense and also
some visceral sense.
This guy, its tail as it goes
out to infinity is much lower.
It's much smaller than 1/x.
And these guys trapped an
infinite amount of area.
This one traps only a
finite amount of area.
All right so now I'm
just going to give one
last example which combines
these two types of pictures.
It's really practically the same as what I've said before. So let's take the following example.
This is somewhat
related to the first one
that I gave last time.
If you take a function
y = 1/(x-3)^2.
And you think
about its integral.
So let's think about the
integral from 0 to infinity,
dx / (x-3)^2.
And suppose you were
faced with this integral.
In order to understand
what it's doing
you have to pay attention to two
places where it can go wrong.
We're going to split
into two pieces.
I'm going to break it up into this one here, from 0 up to 5, for the sake of argument, and from 5 to infinity.
All right.
So these are the two chunks.
Now why did I break it
up into those two pieces?
Because what's happening
with this function
is that it's going
up like this at 3.
And so if I look at
the two halves here.
I'm going to draw
them again and I'm
going to illustrate them
with the colors we've chosen,
which are I guess red and green.
What you'll discover
is that this one
here, which corresponds to
this piece here, is infinite.
And it's infinite
because there's
a square in the denominator.
And as x goes to 3 this
is very much like if we
shifted the 3 to 0.
Very much like this 1/x^2 here -- but near 0, not in the other context where x is going to infinity. This is the same as the picture directly above, with the infinite part in red.
All right.
And this part here,
this part is finite.
All right.
So since we have an infinite
part plus a finite part
the conclusion is that this
thing, well this guy converges.
And this one diverges.
But the total
unfortunately diverges.
Right, because it's
got one infinity in it.
So this thing diverges.
And that's what
happened last time
when we got a crazy number.
If you integrated this you
would get some negative number.
If you wrote down the
formulas carelessly.
And the reason is that
the calculation actually
is nonsense.
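For instance, here is that careless computation in Python: the antiderivative of 1/(x-3)^2 is -1/(x-3), and applying the fundamental theorem blindly from 0 to 5, straight across the singularity at x = 3, produces a negative "area" for a strictly positive integrand.

```python
def antiderivative(x):
    """-1/(x-3), an antiderivative of 1/(x-3)**2, valid only away from x = 3."""
    return -1.0 / (x - 3)

# Naive FTC on [0, 5], ignoring the singularity at x = 3:
careless = antiderivative(5) - antiderivative(0)
# careless is -1/2 - 1/3 = -5/6, which is negative even though
# 1/(x-3)**2 > 0 everywhere. The true integral is infinite.
```

A negative answer for a positive function is exactly the red flashing warning sign.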
So you've gotta be
aware, if you encounter
a singularity in the
middle, not to ignore it.
Yeah.
Question.
AUDIENCE: [INAUDIBLE PHRASE]
PROFESSOR: Why do we say that
the whole thing diverges?
The reason why we say that is
the area under the whole curve
is infinite.
It's the sum of this
piece plus this piece.
And so the total is infinite.
AUDIENCE: [INAUDIBLE PHRASE]
PROFESSOR: We're stuck.
This is an ill-defined integral.
It's one where your red flashing
warning sign should be on.
Because you're not
going to get the right
answer by computing it.
You'll never get an answer.
Similarly you'll never
get an answer with this.
And you will get an
answer with that.
OK?
Yeah another question.
AUDIENCE: [INAUDIBLE PHRASE]
PROFESSOR: So the
question is, if you
have a little glance
at an integral,
how are you going to decide
where you should be heading?
So I'm going to
answer that orally.
Although you know, but I'll
say one little hint here.
So you always have to check
x going to infinity and x
going to minus infinity,
if they're in there.
And you also have to check any
singularity, like x going to 3
for sure in this case.
You have to pay attention
to all the places
where the thing is infinite.
And then you want to focus
in on each one separately.
And decide what's going on at that particular place.
When it's a negative power -- so remember, dx / x as x goes to 0 is bad. And dx / x^2 is bad, dx / x^3 is bad; the higher powers are even worse. So anything of the form dx / x^n is bad, for n = 1, 2, 3, and so on. These are the red box kinds.
All right.
That means that any
of the integrals
that we did in partial
fractions which
had a root, which had
a factor of something
in the denominator.
Those are all divergent integrals if you cross the singularity. Not a single one of them makes sense across the singularity.
Right?
If you have square
roots and things
like that then you can
repair things like that.
And there's some interesting
examples of that.
Such as with the arcsine
function and so forth.
Where you have an improper
integral which is really OK.
All right.
So that's the best I can do.
It's obviously something
you get experience with.
All right.
Now I'm going to move on
and this is more or less
our last topic.
Yay, but not quite.
Well, so I should say it's
our penultimate topic.
Right because we have
one more lecture.
All right.
So our next topic is series.
Now we'll do it in a sort
of a concrete way today.
And then we'll do what are
known as power series tomorrow.
So let me tell you about series.
Remember we're
talking about infinity
and dealing with infinity.
So we're not just talking
about any old series.
We're talking about
infinite series.
There is one infinite series which is without question the most important and useful series.
And that's the
geometric series but I'm
going to introduce it concretely
first in a particular case.
If I draw a picture of this sum.
Which in principle
goes on forever.
You can see that it goes
someplace fairly easily
by marking out what's
happening on the number line.
The first step takes
us to 1 from 0.
And then if I add this
half, I get to 3/2.
Right, so the first step was
1 and the second step was 1/2.
Now if I add this quarter in,
which is the next piece then
I get some place here.
But what I want to
observe is that I got,
I can look at it from
the other point of view.
I got, when I move this quarter
I got half way to 2 here.
I'm putting 2 in green
because I want you to think
of it as being the good kind.
Right.
The kind that has a number.
And not one of the red kinds.
We're getting there
and we're almost there.
So the next stage we
get half way again.
That's the eighth and so forth.
And eventually we get to 2.
So this sum we write equals two.
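In code, you can watch the halving happen: after adding the term (1/2)^n, the partial sum sits exactly (1/2)^n short of 2.

```python
# Partial sums of 1 + 1/2 + 1/4 + ...: each new term closes exactly
# half of the remaining gap to 2.
partial = 0.0
gaps = []
for n in range(30):
    partial += 0.5 ** n
    gaps.append(2 - partial)
# gaps[n] equals (1/2)**n: never zero, but tending to zero.
```

The gap is never zero at any finite stage, which is exactly the paradox discussed next: the sum equals 2 only in the limit.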
All right that's
kind of a paradox
because we never get to 2.
This is the paradox
that Zeno fussed with.
And his conclusion, you know, came from the story of the rabbit chasing the tortoise.
His conclusion--
you know, I don't
know if you're aware of this,
but he understood this paradox.
And he said, you know, it doesn't look like the rabbit ever gets there, because there are infinitely many times at which the rabbit is still behind -- always behind, always behind, always behind. So therefore it's impossible that the rabbit catches up, right.
So do you know what
his conclusion was?
Time does not exist.
That was actually
literally his conclusion.
Because he didn't
understand the possibility
of a continuum of time.
Because there were infinitely many things that happened before the rabbit caught up.
So that was the reasoning.
I mean, it's a long time ago, but he didn't believe in the continuum.
All right.
So anyway that's a small point.
Now the general case of a geometric series is where I put in a number a instead of 1/2 here. So that's 1 + a + a^2 plus... It isn't quite the most general, but anyway I'll write this down.
And you're certainly going
to want to remember that
the formula for this in
the limit is 1/(1-a).
And I remind you that this only works when the absolute value of a is strictly less than 1. In other words, when -1 < a < 1.
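A small numerical sketch of that formula (the helper name is mine): for several values of a with |a| < 1, the partial sums approach 1/(1-a).

```python
def geometric_partial(a, N):
    """Partial sum 1 + a + a**2 + ... + a**N."""
    return sum(a ** n for n in range(N + 1))

# For |a| < 1 the partial sums approach 1/(1-a).
checks = {a: geometric_partial(a, 2000) for a in (0.5, -0.9, 0.95)}
```

With a = 0.5 this recovers the value 2 from the halving picture; negative a also works, as long as |a| < 1.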
And that's really the
issue that we're going
to want to worry about now.
What we're worrying about is
this notion of convergence.
And what goes wrong when
there isn't convergence,
when there's a divergence.
So let me illustrate the
divergences before going on.
And this is what we
have to avoid if we're
going to understand series.
So here's an example when a = 1.
You get 1 + 1 +
1 plus et cetera.
And that's equal to 1/(1-1).
Which is 1 over 0.
So this is not bad.
It's almost right.
Right?
It's sort of infinity
equals infinity.
At the edge here we
managed to get something
which is sort of almost right.
But you know, it's, we don't
consider this to be logically
to make complete sense.
So it's a little dangerous.
And so we just say
that it diverges.
And we get rid of this.
So we're still
putting it in red.
All right.
The bad guy here.
So this one diverges.
Similarly if I take a equals
-1, I get 1 - 1 + 1 - 1 + 1...
Because the odd and the
even powers in that formula
alternate sign.
And this bounces back and forth.
It never settles down.
It starts at 1.
And then it gets down to 0
and then it goes back up to 1,
down to 0, back up to 1.
It doesn't settle down.
It bounces back and forth.
It oscillates.
On the other hand if you
compare the right hand side.
What's the right hand side?
It's 1 / (1-(-1)).
Which is 1/2.
All right.
So if you just paid attention
to the formula, which
is what we were doing when we
integrated without thinking too
hard about this,
you get a number
here but in fact that's wrong.
Actually it's kind of
an interesting number.
It's halfway between the
two, between 0 and 1.
So again there's some
sort of vague sense
in which this is trying
to be this answer.
All right.
It's not so bad but we're still
going to put this in a red box.
All right.
because this is what
we called divergence.
So both of these
cases are divergent.
It only really works when the absolute value of a is less than 1.
I'm going to add
one more case just
to see that mathematicians are
slightly curious about what
goes on in other cases.
So this is 1 + 2 + 2^2 + 2^3 plus etc.
And that should be equal to --
according to this formula --
1/(1-2).
Which is -1.
All right.
Now this one is
clearly wrong, right?
This one is totally wrong.
It certainly diverges.
The left hand side is
obviously infinite.
The right hand side is way off.
It's -1.
On the other hand it
turns out actually
that mathematicians have ways
of making sense out of these.
In number theory
there's a strange system
where this is actually true.
And what happens
in that system is
that what you have to
throw out is the idea
that 0 is less than 1.
There is no such thing
as negative numbers.
So this number exists.
And it's the additive
inverse of 1.
It has this arithmetic
property but the statement
that this is, that 1 is bigger
than 0 does not make sense.
So you have your choice,
either this diverges
or you have to throw
out something like this.
So that's a very curious
thing in higher mathematics.
Which if you get to number
theory there's fun stuff there.
All right.
OK but for our purposes
these things are all out.
All right.
They're gone.
We're not considering them.
Only a between -1 and 1.
All right.
Now I want to do
something systematic.
And it's more or less on
the lines of the powers
that I'm erasing right now.
I want to tell you
about series which are
kind of borderline convergent.
And then next time when we
talk about powers series we'll
come back to this very
important series which
is the most important one.
So now let's talk about some general notation for series. And this will help you with the last bit.
This is going to be pretty
much the same as what
we did for improper integrals.
Namely, first of all I'm
going to have S_N which
is the sum of a_n, n
equals 0 to capital N.
And this is what we're
calling a partial sum.
And then the full sum, which is capital S if you like -- the sum of a_n, n equals 0 to infinity -- is just the limit as N goes to infinity of the S_N's.
And then we have the same kind
of notation that we had before.
Which is there are
these two choices which
is that if the limit exists.
That's the green choice.
And we say it converges.
So we say the series converges.
And then the other case which
is the limit does not exist.
And we can say the
series diverges.
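The definitions translate directly into code; a minimal sketch (names are mine): a partial sum S_N, with convergence as the question of whether S_N settles down as N grows.

```python
def partial_sum(a, N):
    """S_N = a(0) + a(1) + ... + a(N) for a term function a(n)."""
    return sum(a(n) for n in range(N + 1))

# Convergent: the geometric terms (1/2)**n give S_N -> 2.
geometric_S = partial_sum(lambda n: 0.5 ** n, 100)
# Divergent: constant terms 1 give S_N = N + 1, growing without bound.
constant_S = partial_sum(lambda n: 1, 10 ** 4)
```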
Question.
AUDIENCE: [INAUDIBLE PHRASE]
PROFESSOR: The question
was how did I get to this?
And I will do that next
time but in fact of course
you've seen it in high school.
Right this is-- Yeah.
Yeah.
We'll do that next time.
The question was how
did we arrive-- sorry I
didn't tell you the question.
The question was
how do we arrive
at this formula on the
right hand side here.
But we'll talk about
that next time.
All right.
So here's the basic
definition and what
we're going to
recognize about series.
And I'm going to give
you a few examples
and then we'll do
something systematic.
So the first example--
well the first example
is the geometric series.
But the first example that
I'm going to discuss now
and in a little bit of
detail is this sum 1/n^2,
n equals 1 to infinity.
It turns out that this
series is very analogous --
and we'll develop this
analogy carefully --
to the integral from 1 to infinity, dx / x^2.
And we're going to develop
this analogy in detail later
in this lecture.
And this one is
one of the ones--
so now you have to go back
and actually remember,
this is one of the ones you
really want to memorize.
And you should especially
pay attention to the ones
with an infinity in them.
This one is convergent.
And this series is convergent.
Now it turns out that
evaluating this is very easy.
This is 1.
It's easy to calculate.
Evaluating this is very tricky.
And Euler did it.
And the answer is pi^2 / 6.
That's an amazing calculation.
And it was done very early in
the history of mathematics.
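Euler's value can be checked numerically. The partial sums converge slowly -- the tail past N is roughly 1/N, by the same integral picture used later in this lecture -- so the check below adds that correction.

```python
import math

# Euler: 1 + 1/4 + 1/9 + ... = pi**2 / 6.
N = 100_000
s = sum(1.0 / n ** 2 for n in range(1, N + 1))
target = math.pi ** 2 / 6
# target - s is the tail, which sits between 1/(N+1) and 1/N;
# adding 1/N back gives agreement to about 1/(2*N**2).
```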
If you look at another example--
so maybe example two here,
if you look at 1/n^3, n equals--
well you can't start here at 0
by the way.
I get to start wherever
I want in these series.
Here I start with 0.
Here I started with 1.
And notice the reason
why I started--
it was a bad idea to start
with 0 was that 1 over 0
is undefined.
Right?
So I'm just starting where
it's convenient for me.
And since I'm interested
mostly in the tail behavior
it doesn't matter to me
so much where I start.
Although if I want
an exact answer
I need to start
exactly at n = 1.
All right.
This one is similar
to this integral here.
All right.
Which is convergent again.
So there's a number
that you get.
And let's see, what is it? It's 1/2: the antiderivative of 1/x^3 is -1/(2x^2), so the integral from 1 to infinity comes out to 1/2.
Anyway it's an easy
number to calculate.
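A quick check of that 1/2 in Python (the helper name is mine): the antiderivative of 1/x^3 is -1/(2x^2), which vanishes at infinity.

```python
def antideriv_cubed(x):
    """-1/(2*x**2), an antiderivative of 1/x**3."""
    return -1.0 / (2 * x ** 2)

# Integral from 1 to infinity: limit at infinity (which is 0)
# minus the value at 1.
value = 0 - antideriv_cubed(1)
```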
This one over here stumped
mathematicians basically
for all time.
It doesn't have any kind of
elementary form like this.
And it was only relatively recently proved to be irrational. Before that, people couldn't even decide whether this was a rational number or not. But anyway that's been resolved: it is an irrational number, which is what people suspected.
Yeah question.
AUDIENCE: [INAUDIBLE PHRASE]
PROFESSOR: Yeah sorry.
OK.
I violated a rule
of mathematics--
you said why is this similar?
I thought that similar
was something else.
And you're absolutely right.
And I violated a
rule of mathematics.
Which is that I used this
symbol for two different things.
I should have written
this symbol here.
All right.
I'll create a new symbol here.
The question of whether this
converges or this converges.
These are the same type of question.
And we'll see why they're
the same question it
in a few minutes.
But in fact the wiggle I used, "similar", I used for the connection between functions. The things that are really similar are the functions: 1/n^2 resembles 1/x^2. So I apologize, I didn't--
AUDIENCE: [INAUDIBLE PHRASE]
PROFESSOR: Oh you
thought that this
was the definition of that.
That's actually the reason
why these things correspond
so closely.
That is that the Riemann
sum is close to this.
But that doesn't
mean they're equal.
The Riemann sum only works
when the delta x goes to 0.
The way that we're going to get a connection between these two, as we will in just a second, is with a Riemann sum with delta x = 1. All right, and that will be the connection between them.
All right that's
absolutely right.
All right.
So in order to illustrate
exactly this idea
that you've just come
up with, and in fact
that we're going to use,
we'll do the same thing
but we're going to do it
on the example sum 1/n.
So here's Example 3 and
it's going to be sum 1/n,
n equals 1 to infinity.
And what we're now
going to see is
that it corresponds
to this integral here.
And we're going
to show therefore
that this thing diverges.
But we're going to do
this more carefully.
We're going to do
this in some detail
so that you see what it is,
that the correspondence is
between these quantities.
And the same sort
of reasoning applies
to these other examples.
So here we go.
I'm going to take
the integral and draw
the picture of the Riemann sum.
So here's the level 1 and
here's the function y = 1/x.
And I'm going to
take the Riemann sum.
With delta x = 1.
And that's going to be closely
connected to the series
that I have.
But now I have to decide
whether I want a lower Riemann
sum or an upper Riemann sum.
And actually I'm going
to check both of them
because both of them
are illuminating.
First we'll do the upper Riemann sum. Now that's this staircase here. So we'll call this the upper Riemann sum.
And let's check
what its levels are.
This is not to scale.
This level should be 1/2.
So if this is 1 and
this is 2 and that level
was supposed to be 1/2 and
this next level should be 1/3.
That's how the Riemann
sums are working out.
And now I have the
following phenomenon.
Let's cut it off at the Nth stage. So that means the integral is from 1 to N, dx / x.
And the Riemann sum is
something that's bigger than it.
Because the areas are enclosing
the area of the curved region.
And that's going to be the
area of the first box which
is 1, plus the area of the
second box which is 1/2,
plus the area of the
third box which is 1/3.
All the way up the last one,
but the last one starts at N-1.
So it has 1/(N-1).
There are not N boxes here.
There are only N-1 boxes.
Because the distance
between 1 and N is N-1.
Right so this is N-1 terms.
However, if I use the notation for the partial sum, which is S_N = 1 + 1/2 + ... + 1/(N-1) + 1/N.
In other words I go out
to the Nth one which
is what I would ordinarily do.
Then this sum that I have here
certainly is less than S_N.
Because there's one
more term there.
And so here I have
an integral which
is underneath this sum S_N.
Now this is going to allow
us to prove conclusively
that the-- So I'm just
going to rewrite this, prove
conclusively that
the sum diverges.
Why is that?
Because this term
here we can calculate.
This is log x evaluated at 1 and N. Which is log N minus log 1, and log 1 is 0. So it's log N.
And so what we have here is
that log N is less than S_N.
All right and clearly this
goes to infinity right.
As N goes to infinity this
thing goes to infinity.
So we're done.
All right we've
shown divergence.
Now the way I'm going to use the lower Riemann sum
is to recognize
that we've captured
the rate appropriately.
That is not only do I have
a lower bound like this
but I have an upper bound
which is very similar.
So if I use the lower Riemann sum, again with delta x = 1.
Then I have that the integral from 1 to N of dx / x is bigger than-- well, what are the terms going to be if I fit them underneath?
If I fit them underneath
I'm missing the first term.
That is the box is
going to be half height.
It's going to be
this lower piece.
So I'm missing this first term.
So it'll be a 1/2 + 1/3 plus...
All right, it will
keep on going.
But now the last one, instead of being 1/(N-1), is going to be 1/N. This is again a total of N-1 terms.
This is the lower Riemann sum.
And now we can recognize that this is exactly equal to -- well, I'll put it over here -- S_N minus 1. That is, S_N minus its first term.
So we missed the first term but
we got all the rest of them.
Remember, this integral is log N. So if I put this 1 to the other side, I have the other side of the bound: S_N is less than log N + 1.
And so I've trapped
it on the other side.
And here I have the lower bound.
So I'm going to
combine those together.
So all told, I have this correspondence: the size of S_N, which is relatively hard to calculate and understand exactly, is trapped between log N and log N + 1.
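That trap is easy to verify numerically; a minimal sketch (the function name is mine):

```python
import math

def harmonic(N):
    """S_N = 1 + 1/2 + ... + 1/N."""
    return sum(1.0 / n for n in range(1, N + 1))

# The two Riemann sums trap the partial sum: log N < S_N < log N + 1.
trapped = all(
    math.log(N) < harmonic(N) < math.log(N) + 1
    for N in (2, 10, 1000, 10 ** 5)
)
```

Since log N goes to infinity, the lower bound alone already proves divergence.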
Yeah question.
AUDIENCE: [INAUDIBLE PHRASE]
PROFESSOR: This step
here is the step
that you're concerned about.
So this step is a
geometric argument which
is analogous to this step.
All right it's the
same type of argument.
And in this case it's that
the rectangles are on top
and so the area represented
on the right hand side
is less than the area
represented on this side.
And this is the
same type of thing
except that the
rectangles are underneath.
So the sum of the
areas of the rectangles
is less than the
area under the curve.
All right.
So I've now trapped
this quantity.
And I'm now going to state
the sort of general results.
So here's what's known
as integral comparison.
It's this double
arrow correspondence
in the general case,
for a very general case.
There are actually even more cases where it works, but this is a good, convenient case. Integral comparison comes with hypotheses, but it follows the same argument that I just gave.
If f(x) is decreasing
and it's positive,
then the sum f(n), n
equals 1 to infinity,
minus the integral from
1 to infinity of f(x) dx
is less than f(1).
That's basically what we showed.
We showed that the difference
between S_N and log N
was at most 1.
All right.
And the sum and the integral converge or diverge together. That is, they either both converge or both diverge.
This is the type of test that
we like because then we can just
convert the question of
convergence over here
to this question of convergence
over on the other side.
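Here is the bound checked on the example f(x) = 1/x^2, which is positive and decreasing, using Euler's value for the sum:

```python
import math

# Integral comparison for f(x) = 1/x**2:
# (sum of f(n), n = 1 to infinity) - (integral of f from 1 to infinity)
# should lie between 0 and f(1) = 1.
series_value = math.pi ** 2 / 6    # Euler's value of sum 1/n**2
integral_value = 1.0               # integral of dx/x**2 from 1 to infinity
difference = series_value - integral_value
```

The difference is about 0.645, comfortably below f(1) = 1.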
Now I remind you that
it's incredibly hard
to calculate these numbers.
Whereas these numbers
are easier to calculate.
Our goal is to reduce
things to simpler things.
And in this case
sums, infinite sums
are much harder than
infinite integrals.
All right so that's the
integral comparison.
And now I have one
last bit on comparisons
that I need to tell you about.
And this is very much like
what we did with integrals.
Which is a so called
limit comparison.
The limit comparison
says the following:
if f(n) is similar to g(n) --
recall that means f(n) / g(n)
tends to 1 as n
goes to infinity --
and we're in the positive case -- so let's just say g(n) is positive -- then sum f(n) and sum g(n), same thing as above, either both converge or both diverge.
All right.
This is just saying
that if they behave
the same way in the tail, which
is all we really care about,
then they have similar behavior,
similar convergence properties.
And let me give you
a couple examples.
So here's one example: if you take the sum of 1 over the square root of n^2 + 1.
This is going to be replaced
by something simpler.
Which is the main term here.
Which is 1 over
square root of n^2,
which we recognize as
sum 1/n, which diverges.
So this guy is one
of the red guys.
On the red team.
Now we have another example.
Which is let's say the square
root of n, I don't know,
to the fifth minus n^2.
Now if you have
something where it's
negative in the
denominator you kind of
do have to watch out that
denominator makes sense.
It isn't 0.
So we're going to be careful
and start this at n = 2.
In which case, the first
term, I don't like 1/0
as a term in my series.
So I'm just going to be a
little careful about how--
as I said I was
kind of lazy here.
I could have started this
one at 0 for instance.
All right.
So here's the picture.
Now this I just replace by its main term, which is 1 over the square root of n^5.
Which is sum 1/n^(5/2),
which converges.
All right.
The power is bigger than 1.
1 is the divider for these
things and it just misses.
This one converges.
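Both limit comparisons can be checked directly: the ratio of each term to its simpler comparison term tends to 1 (the function names below are mine).

```python
import math

# Limit comparison: the ratio term / comparison-term should tend to 1.
def ratio_1(n):
    """(1/sqrt(n**2 + 1)) / (1/n) -- the divergent example, compared with 1/n."""
    return (1 / math.sqrt(n ** 2 + 1)) / (1 / n)

def ratio_2(n):
    """(1/sqrt(n**5 - n**2)) / (1/n**2.5) -- the convergent example."""
    return (1 / math.sqrt(n ** 5 - n ** 2)) / (1 / n ** 2.5)
```

So the first series shares the fate of sum 1/n (divergent), and the second shares the fate of sum 1/n^(5/2) (convergent).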
All right so these
are the typical ways
in which these convergence
processes are used.
All right.
So I have one more
thing for you.
Which is an advertisement
for next time.
And I have this demo
here which I will grab.
But you will see this next time.
So here's a question for
you to think about overnight
but don't ask friends, you have
to think about it yourself.
So here's the problem.
Here are some blocks
which I acquired
when my kids left home.
Anyway yeah that'll happen to
you too in about four years.
So now here you are,
these are blocks.
So now here's the
question that we're
going to deal with next time.
I'm going to build
it, maybe I'll
put it over here because
I want to have some room
to head this way.
I want to stack them up
so that-- oh didn't work.
Going to stack them up
in the following way.
I want to do it so that
the top one is completely
to the right of the bottom one.
That's the question
can I do that?
Can I get-- Can I build this up?
So let's see here.
I just seem to be missing--
but anyway what I'm going to do
is I'm going to
try to build this
and we're going to see how far
we can get with this next time.
