This lecture is all about apparent brightness,
luminosity, and distance.
We'll begin with apparent brightness and luminosity.
If you go outside on any clear night, you'll
immediately see that some stars are brighter
than other stars.
The difference in brightness does not by itself
tell us anything about how much light these
stars are generating, because the brightness
of a star depends on its distance as well
as on how much light it actually emits.
For example, the stars Procyon and Betelgeuse
appear about equally bright in our sky, but
Betelgeuse emits about 15,000 times as much
light as Procyon.
Procyon appears equally bright only because it is more than 50 times closer to us.
Because two similar-looking stars can be generating
very different amounts of light, we need to
distinguish clearly between a star's brightness
in our sky and the actual amount of light
that it emits into space.
A star's apparent brightness is how it appears
to our eyes.
We define apparent brightness as the amount
of power reaching us per unit area, or per
square meter.
Luminosity is the total amount of power that
a star emits into space.
When we talk about how bright stars are in
an absolute sense, regardless of their distance,
we're talking about luminosity.
For example, a 100-watt light bulb always
puts out the same amount of light.
Its luminosity doesn't vary.
But its apparent brightness will depend on
how far away you are from the bulb.
The apparent brightness of a star, or of any light source, obeys an inverse square law with distance, much like the inverse square law for the force of gravity.
In equation form, the inverse square law of
light says that the apparent brightness equals
luminosity over four pi times the distance
squared.
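As a quick sanity check (a sketch, not from the lecture; the numbers below are standard values I'm supplying): plugging the Sun's luminosity, about 3.8 × 10^26 watts, and the Earth-Sun distance, about 1.5 × 10^11 meters, into this formula should give roughly the measured "solar constant" of about 1360 watts per square meter.

```python
import math

def apparent_brightness(luminosity_w, distance_m):
    """Inverse square law of light: b = L / (4 * pi * d^2)."""
    return luminosity_w / (4 * math.pi * distance_m ** 2)

L_SUN = 3.828e26   # Sun's luminosity in watts
AU = 1.496e11      # Earth-Sun distance in meters

b = apparent_brightness(L_SUN, AU)
print(f"{b:.0f} W/m^2")  # roughly the measured solar constant, ~1360 W/m^2

# Doubling the distance cuts the apparent brightness to one quarter:
print(apparent_brightness(L_SUN, 2 * AU) / b)  # 0.25
```

The second line of output is the inverse square law in action: twice the distance, one quarter the brightness.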
Consider this figure.
The same total amount of light must pass through each imaginary sphere surrounding the star.
Pretend you're standing on the first sphere.
You draw a square meter on its surface and
you count the number of photons that pass
through your square each second.
Let's say you measure 4 photons.
Now move to the sphere that is twice as far
away and draw a square the same size.
If you count the number of photons that pass
through this square, you'll only count 1.
This is the inverse square law at work: as the light spreads out, each square on the sphere twice as far away receives only a quarter of the light that passes through the square on the first sphere.
A star's luminosity can be calculated from its apparent brightness and its distance.
In most cases we can easily measure apparent
brightness from Earth.
The distance is often more difficult to determine,
but we need it if we want to determine luminosity.
We'll discuss distance measurements in just
a bit.
To describe apparent brightness and luminosity,
we use what is called the magnitude system.
It was developed by the Greek astronomer Hipparchus
over 2000 years ago.
The magnitude system originally classified
stars based on how bright they look to human
eyes.
The brightest stars were called "first magnitude",
the next brightest "second magnitude" and
so on.
The faintest visible stars were magnitude
6.
These descriptions are called apparent magnitudes
because they compare how bright different
stars appear in the sky.
We denote apparent magnitudes with a lower-case
m.
The magnitude scale is such that a larger
number for apparent magnitude means a dimmer
apparent brightness.
A star of apparent magnitude 4 appears dimmer
in the sky than a star of magnitude 1.
Objects brighter than magnitude zero have negative apparent magnitudes.
For example, the full Moon has an apparent magnitude of about minus 13.
The original magnitude scale was based on
the human eye.
Astronomers use a more precisely defined system
today.
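In the modern system, a difference of 5 magnitudes corresponds to exactly a factor of 100 in apparent brightness, so each magnitude step is a factor of about 2.512. A small sketch of that definition (the function name is mine):

```python
def brightness_ratio(m1, m2):
    """Ratio b1/b2 of apparent brightnesses for apparent magnitudes m1, m2.

    Modern magnitude scale: a 5-magnitude difference is exactly a factor
    of 100 in brightness, and a larger magnitude means a dimmer star.
    """
    return 100 ** ((m2 - m1) / 5)

# A magnitude-1 star is exactly 100 times brighter than a magnitude-6 star:
print(brightness_ratio(1, 6))  # 100.0

# A single magnitude step is a factor of about 2.512:
print(brightness_ratio(1, 2))
```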
The modern magnitude system also defines absolute
magnitudes as a way of describing stellar
luminosities.
A star's absolute magnitude is the apparent magnitude it would have if it were at a distance of 10 parsecs from Earth.
We denote absolute magnitudes with an upper-case
M.
For example, the Sun's apparent magnitude, how bright it appears to us, is about minus 27.
If we were to move the Sun 10 parsecs away,
it would appear dimmer and we would measure
an apparent magnitude of 4.8.
Therefore, the absolute magnitude of the Sun
is 4.8.
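The standard relation connecting these two magnitudes (not derived in the lecture, but worth knowing) is the distance modulus: m - M = 5 log10(d / 10 pc), with the distance d in parsecs. A sketch verifying the Sun's numbers, using the more precise values m ≈ -26.74 and 1 AU ≈ 4.848 × 10^-6 parsecs:

```python
import math

def absolute_magnitude(m, distance_pc):
    """Absolute magnitude via the distance modulus: m - M = 5 * log10(d / 10 pc)."""
    return m - 5 * math.log10(distance_pc / 10)

AU_IN_PC = 4.848e-6  # one astronomical unit expressed in parsecs

# The Sun's apparent magnitude at 1 AU is about -26.74:
M_sun = absolute_magnitude(-26.74, AU_IN_PC)
print(round(M_sun, 1))  # 4.8, matching the value quoted above
```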
Remember, if we want to get at luminosity,
we need to know distance.
The most direct way to measure a star's distance is with stellar parallax.
We'll discuss other methods of determining
distance later in the semester.
You may recall that parallax is the apparent
shift in position of a nearby object against
the background of more distant objects.
Astronomers measure stellar parallax by comparing
observations of a nearby star made six months
apart.
A nearby star will appear to shift against
the background of more distant stars because
we are observing it from two different points in Earth's orbit.
We can calculate a star's distance if we know
the precise amount of the star's annual shift
due to parallax.
This means measuring the angle p, which we call the star's parallax angle.
Note that p is equal to half the star's annual
back and forth shift.
The farther away the star is, the smaller its parallax angle becomes.
Even the stars nearest to us have parallax angles smaller than 1 arcsecond, well below the angular resolution of the human eye.
This is why the ancient Greeks were never able to measure stellar parallax.
By definition, the distance to an object with
a parallax angle of 1 arcsecond is 1 parsec.
If we use units of arcseconds for the parallax
angle, p, the distance in parsecs is simply
1 over the parallax angle.
For example, for a star with a parallax angle
of one-half of an arcsecond, the distance
is two parsecs.
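In code, this relation is a single line. A sketch, with Proxima Centauri's parallax (a value I'm supplying, about 0.77 arcseconds) added for illustration:

```python
def distance_pc(parallax_arcsec):
    """Distance in parsecs from a parallax angle in arcseconds: d = 1 / p."""
    return 1.0 / parallax_arcsec

print(distance_pc(0.5))   # 2.0 parsecs, the example above

# Proxima Centauri, the nearest star, has a parallax of about 0.77 arcseconds:
print(distance_pc(0.77))  # about 1.3 parsecs
```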
Remember, if you know a star's distance you
can determine its luminosity.
For a star whose parallax angle you have measured, you can calculate the distance, and you can also measure its apparent brightness.
You then have everything you need to determine its luminosity, the intrinsic brightness of the star.
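The whole chain, parallax to distance to luminosity, can be sketched in a few lines; here the inverse square law is simply solved for L, giving L = 4πd²b. The star's numbers below are hypothetical, chosen only for illustration:

```python
import math

PC_IN_M = 3.086e16  # one parsec in meters

def luminosity_from_parallax(parallax_arcsec, brightness_w_m2):
    """Luminosity in watts from a parallax angle (arcseconds) and a measured
    apparent brightness (W/m^2): d = 1/p parsecs, then L = 4 * pi * d^2 * b."""
    d_m = (1.0 / parallax_arcsec) * PC_IN_M
    return 4 * math.pi * d_m ** 2 * brightness_w_m2

# Hypothetical star: parallax 0.1 arcsec (so 10 parsecs away),
# measured apparent brightness 1e-9 W/m^2.
L = luminosity_from_parallax(0.1, 1e-9)
print(f"{L:.2e} W")  # about 1.2e27 W, roughly 3 times the Sun's luminosity
```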
That's all for now.
We'll learn how astronomers measure stellar
temperature and mass in the next lecture.
Take care, I'll talk to you soon.
