Joe Paradiso: We've been really privileged
the last few years, to have a few new windows
open up to the universe, that really make
us all very excited.
Recent missions discovering extrasolar planets and new techniques have been making that whole field explode.
The LIGO gravitational wave direct detection
has just given us a whole new window on black
holes and gravitation.
And now this, the first image of a black hole.
We know for real now that they exist.
We've always known but we've never seen one.
Of course, you can't really see the black
hole, but now we see clearly its shadow.
When you go beyond the event horizon, you're
gone, and that's it.
And this is the first time we've ever seen
this, and the whole world has been totally
transfixed by this picture.
Joe Paradiso: We're really fortunate here in Cambridge to have, as I called it, a centroid of the community here; we have lots of the participants locally based. And fortunately, we've been able to convince them to come down to the Media Lab, so close to this great discovery, to share their insights with us.
So it's a real privilege to host this event.
Joe Paradiso: We're gonna have a panel at
the end.
The speakers will come up and give different
kinds of insights into the project and into
their achievement, into their techniques.
And then we'll all come up here afterwards
and I'll start off with some questions, and
then we'll open up to the audience.
So this should be an informative and stimulating
and wonderful time.
Joe Paradiso: Michael is gonna introduce the
speakers.
He's the first speaker, but I'll introduce
him, and then we'll go from there.
Michael Johnson is an astrophysicist at the
Center for Astrophysics at Harvard and Smithsonian.
He received BS degrees in mathematics and
physics from the University of Southern California.
And a PhD in physics from UCSB, where he studied
pulsars and interstellar scattering.
Joe Paradiso: Michael joined the Event Horizon Telescope in 2013, and currently leads the EHT Imaging Working Group.
In 2018, he was awarded the Secretary's Research
Award from the Smithsonian, for his work developing
new algorithms to reconstruct movies from
interferometric data.
He has contributed broadly across the EHT
project, from imaging methods to magnetic
field polarimetry, to the scattering and time
evolution of black holes.
So Michael, you can start us off.
Michael Johnson: Welcome, everyone.
Two days ago, the event horizon telescope
released this image, and this really is the
first ever picture of a black hole.
And MIT has played an absolutely foundational
role in getting us to this point.
And so, we just wanted to come share with
you the excitement of this moment, and then
tell you about the role of MIT in making it
possible.
Michael Johnson: So, first of all, we'll hear
from Vincent Fish.
Vincent is an MIT graduate, an MIT PhD, and he's now a research scientist at MIT Haystack Observatory.
And so, Vincent will tell us about how the
instrument was created, how observations are
mounted, and how each year, we're making the
instrument better to get sharper and sharper
images of this black hole.
Michael Johnson: After that, we'll hear from
Lindy Blackburn, another MIT PhD.
And Lindy will tell us about how the data
are processed.
These are huge data volumes, petabytes of
data, that are reduced down to make this final
image.
And so, Lindy led the team that did all of
that data reduction, and we'll hear about
that process.
Michael Johnson: And after that, I'll tell
you how we turn that data into a picture and
what it means.
So, Vincent.
Vincent Fish: Thank you so much.
So we're very fortunate that nature has given us a coincidence of sorts in that, when you look at M87, for instance, and you ask what is the size of the shadow, what is the size of the photon ring in M87, you can work out that it should be about 40 microarcseconds or so, which is an impossibly small angular resolution to have to get down to.
But then, you look at what is the largest
array that you can field from the surface
of the earth.
And if you go to the shortest wavelength that
you can, at the time, which is 1.3 millimeters,
you find that you actually get an angular
resolution that is sufficient to make that
image.
Vincent Fish: So there are a number of coincidences here. One is that match of scales. Another is that, when you look at M87, for instance, at longer wavelengths, you see quite a large jet. As you go down to shorter and shorter wavelengths, higher and higher frequencies, you really start zooming in.
So, nature has been kind to us.
Vincent Fish: So I'd like to spend a moment
just talking about the basics of interferometry.
Interferometry gets a reputation for being
very difficult, and there are technical challenges
to it.
But in principle, it's very simple.
Many of you would be familiar, from an optics class or from using an optical telescope, that if you have a parabolic mirror, then all of the rays on a parallel incoming wavefront get focused to a point. And that's just due to the geometry of the problem: the optical path lengths are the same, no matter what ray you choose.
Vincent Fish: And that's true also, if you
remove segments from a mirror.
As long as those segments are on a paraboloid,
everything will come together and focus.
And now of course, you lose out, some of your
photons fall through the mirror, and the image
that you make at the end won't be quite as
good.
But the way that it's not as good is highly
predictable, and you can compensate for it.
Vincent Fish: Now, if you were to take those
segments and not keep them on a paraboloid,
but put them on the surface of the earth,
of course, they wouldn't focus anymore.
But if you can accurately time tag your data,
and if you can record both the amplitude and
phase of the data, then you can reconstruct
the telescope as though you were able to have
a gigantic paraboloid again.
So, in this case, for very long baseline interferometry, you're looking at some distant object.
And by the time the radiation gets to you,
you can talk about planar wavefronts, and
there is a geometric delay that your signal
will get to one of your telescopes before
the other.
And that's entirely predictable, from where
you're looking in the sky, where your telescopes
are, what time it is.
And you can take that out, and you can combine
the signals.
And now your angular resolution is approximately
the wavelength divided by the baseline between
the telescopes.
Vincent Fish: So, you're not talking about
meters for a diameter of a telescope anymore,
you're talking about thousands of kilometers.
And to this day, very long baseline interferometry at millimeter wavelengths gives you the finest angular resolution of all of the electromagnetic techniques.
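As a rough sketch of the diffraction-limit arithmetic above (illustrative numbers, not the EHT's exact configuration):

```python
import math

# Diffraction-limited angular resolution: theta ~ wavelength / baseline.
wavelength_m = 1.3e-3   # 1.3 mm, i.e. 230 GHz
baseline_m = 1.0e7      # ~10,000 km, roughly an Earth-scale baseline

theta_rad = wavelength_m / baseline_m
RAD_TO_UAS = (180.0 / math.pi) * 3600.0 * 1e6  # radians -> microarcseconds

theta_uas = theta_rad * RAD_TO_UAS
print(f"angular resolution ~ {theta_uas:.0f} microarcseconds")
```

An Earth-scale baseline at 1.3 mm lands in the tens of microarcseconds, comfortably matched to the roughly 40 microarcsecond shadow of M87.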
Vincent Fish: Now, it's not easy to do this.
So the telescopes that we use are on top of
mountains, for reasons.
And the reasons are associated with the atmosphere.
So the atmosphere wants to absorb the radiation
that is coming from the source, and it also
wants to emit thermal radiation and noise
at you.
And the plot on the left there shows the atmospheric transmission as a function of frequency, and we're up at 230 gigahertz. This is for five millimeters of precipitable water vapor, which would be considered pretty bad weather for Mauna Kea; but on a cold night in Boston, we're lucky to get down to this. And this is for zenith transmission.
Vincent Fish: So, if you're looking low to
the horizon, as some of our sources are, then
most of your signal gets absorbed.
And so you want to be above that.
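The elevation effect described here can be sketched with a toy model, transmission = exp(-tau * airmass), with airmass approximately 1/sin(elevation); the zenith opacity below is an assumed illustrative value, not a measured one:

```python
import math

TAU_ZENITH = 0.3  # assumed 230 GHz zenith opacity, for illustration only

def transmission(elevation_deg, tau=TAU_ZENITH):
    # The line of sight crosses ~1/sin(elevation) times more atmosphere
    # than at zenith, so the opacity is multiplied by that airmass.
    airmass = 1.0 / math.sin(math.radians(elevation_deg))
    return math.exp(-tau * airmass)

print(f"zenith (90 deg): {transmission(90):.2f}")
print(f"low (15 deg):    {transmission(15):.2f}")
```

Even with decent zenith transparency, a source near the horizon loses most of its signal, which is why site altitude and source elevation both matter.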
And the atmosphere also scrambles the phase
of the radiation that you receive.
So as I said, we record complex voltages,
so we get amplitudes and phases.
But because the atmosphere is introducing
this variable path length due to turbulence,
little clouds of water vapor going through,
on a timescale of seconds, the phase that
you think you measure has changed, because
there's this additional delay.
And that means that, if you want to detect
your source in the first place, you have to
detect it quickly.
So, you need a lot of sensitivity.
Vincent Fish: And so, it's expensive to build
telescopes.
But as I showed a few slides ago, there are
millimeter wave observatories already existing
in the world, and we can't make them larger.
So, how do we collect more photons?
Well, we've really been pushing toward more
bandwidth.
Because the more of the frequency range that
you're observing, the more photons and more
sensitivity you have.
And the digital revolution has really made
this possible.
Vincent Fish: So, back 15 years ago or so, much of radio astronomy was still being done on magnetic tapes, with analog electronics for your backend.
And that really set a limit on how far you
could push technology.
But now, thanks to analog-to-digital converters that run at very high speed, FPGAs that can do a lot of processing, and hard drives that are growing in capacity, you can get on the COTS (commercial off-the-shelf) curve and really push your bandwidths.
And so, the curve here in the bottom right
shows the recording rate that the best VLBI
observations have been able to do, over the
course of the years.
And the EHT has really pushed on this, because
that's where your sensitivity comes from.
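The sensitivity argument can be sketched with radiometer-style scaling, where the noise on an averaged correlation falls as 1/sqrt(2 * bandwidth * time); the numbers below are illustrative:

```python
import math

def relative_noise(bandwidth_hz, integration_s):
    # Radiometer-style scaling: noise averages down with the number of
    # independent samples, ~ 2 * bandwidth * integration time.
    return 1.0 / math.sqrt(2.0 * bandwidth_hz * integration_s)

n_narrow = relative_noise(2e9, 10.0)  # 2 GHz recorded for 10 s
n_wide = relative_noise(4e9, 10.0)    # doubled bandwidth, same time
print(f"doubling bandwidth improves sensitivity by {n_narrow / n_wide:.3f}x")
```

So each doubling of recorded bandwidth buys a factor of sqrt(2) in sensitivity, which is why the recording-rate curve matters so much.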
Vincent Fish: Another aspect of that is that some of the telescopes and observatories we use actually consist of multiple telescopes.
And it's much better to combine the signals
so that you add all that aperture together
into one larger aperture, than to have to
record the signals from all of those antennas
and find your fringes to each of them.
Vincent Fish: So, here are two of the phased arrays. The one in the bottom right here is the Submillimeter Array in Hawaii, and the Smithsonian Astrophysical Observatory has built two generations of [inaudible 00:11:04] processors for it, which basically means that, when you point at an object, you can sum those signals coherently as though you had one larger telescope.
Really, the breakthrough for the EHT has been
ALMA.
Vincent Fish: MIT Haystack Observatory has
led an international collaboration involving
many institutes from many countries to phase
up dozens of antennas at ALMA, which is at
one of the best sites on the planet, for millimeter
wave astronomy.
So, you get really dry conditions, you get
the equivalent of a 70 meter or 80 meter dish
up there, you get very stable conditions.
And that can serve as the anchor for the array, making it possible to detect all your other stations, and then you can connect through those detections.
Vincent Fish: Then, after you take the data, we send it back to the correlator. So we have a correlator at MIT Haystack Observatory, and there's one at the Max Planck Institute for Radio Astronomy in Bonn.
And the point of the correlation process is
to take those disc packs that we record on,
and play them back, inserting the appropriate
delays to make everything coherent.
And you just have to do that over and over
and over again.
You have to do that over large data rates,
you have to do that for every pair of antennas
in your array.
So, there's a lot of computation there.
So basically, the correlator is acting as
the lens or the mirror that brings all of
your signals back together for you.
Vincent Fish: And I should point out that,
if you are interested in seeing a correlator,
MIT Haystack Observatory is not far away.
We are a part of MIT, though we're not here
on campus.
And it's a nice visit; it looks purple to me. So, there's an open house at Haystack, May 2nd. If you'd like to come out and hear about what we do, see some of the instrumentation, and see the correlator, please check out our website and contact us, and we'll reserve you a spot.
Vincent Fish: So, once you've taken the data,
of course, you have to analyze it.
And the data reduction and calibration process
is quite complicated.
And so I'm going to hand off to Lindy Blackburn,
who will tell you a little bit about that.
Lindy Blackburn: Thank you, Vincent.
I'm Lindy Blackburn, I'm at the Center for
Astrophysics.
I joined the EHT in 2014.
Michael was already there, and we were both
postdocs working on this project.
It's been really wonderful to see it through
to the first image.
So, I'll take a little bit of a step back, 'cause I want to cover the story of how we go through this 12 orders of magnitude of data reduction for the EHT. It's really an incredible amount of decimating the data down. And you might wonder why that is, and it really owes itself to both the weakness of the signals we're trying to detect, and also some of the complexity of things like fitting out the atmosphere, where we really need to record sensitively on timescales of just seconds to be able to model these effects.
Lindy Blackburn: So, if we begin here with the correlation that was just covered, this is where we really form the virtual mirror, and fundamentally transform the data to a different type, right? Going in, it's raw signals from each telescope; coming out, we're talking about measured correlations between telescopes. And we can average down, maybe reduce it by a factor of 10 to the three, 10 to the four.
Lindy Blackburn: Calibration is, in some ways, a very similar process, where we're also modeling largely what can be considered clock corrections to our data. We have very high frequency data, 230 gigahertz, and so it's very sensitive to the timing of that data. But the essential difference here is that in correlation, we're largely correcting for a priori clock corrections. It's a very careful and detailed model of the Earth delays, but still a priori, apart from the data itself. In calibration, we want to fit those models: we actually take a look at the data, extract the signals, and build a model and fit it, to average down more, by as much as a factor of a million.
Lindy Blackburn: And then, we use that data
to infer something about the source, like
the image.
And I purposely put in a pixelated and a simulated
image here, just to emphasize that there are
many, many fewer parameters that we can talk
about, about the source, versus all of the
systematics and such that we need to fit in
the underlying raw data.
So, that's a story of this process.
And I'm gonna walk through a little bit, starting
from the very beginning.
Yes, very interesting colors here.
Lindy Blackburn: I've been going to the LMT, taking a break from my usual job about a month out of each year for the last four years, for commissioning and observing at the LMT, bringing it up for VLBI. First, at one millimeter in 2015. And finally, in 2018, we brought it up to 64 gigabits per second, which is double what we recorded in 2017. The LMT is an amazing site. I have never been up on a mountain before, so for me, it was really extraordinary. It's a 50 meter dish; that's a humongous dish. And it's the largest that can operate at one millimeter, in the world.
Lindy Blackburn: I don't know if you can see that small dark hole in the middle; that's the hole at the apex where the signals come through. So for a sense of scale, this is looking out through the apex of the telescope. You see the primary, and you see the secondary mirror, where the radio waves are reflected. They come in through this hole, get bounced off a few more mirrors, and get onto the receiver, a very small ... [inaudible 00:17:04] is gonna be on the order of one millimeter also.
And throughout this whole process, the telescope
is constrained to a very high level of precision,
right?
The path lengths over this entire 50 meter
diameter have to be good to a very small fraction
of one millimeter, to be able to coherently
add the signals together, just to this one
telescope.
So, that's why there aren't too many of these telescopes around the world, and why the LMT, being the largest of them, is quite complicated to operate.
Lindy Blackburn: Oh, I didn't mention this.
This is a large millimeter telescope, it's
in central Mexico, quite a nice place to go
visit.
It's an extraordinary site.
Lindy Blackburn: This is Daniel Palumbo, who joined me in early 2018. He was an MIT physics undergrad at that time, and helped commission the site that year. Now he's with us at Harvard, as a grad student on the project.
And this is a picture of another backend setup. Like what you saw from Vincent, this is taking the [inaudible 00:18:03] signal, digitizing it, and recording it to disks. So, this is the process. It's rather extraordinary. We can do most of our processing just using a two-bit, four-level representation of the sampled signal. So you see these little dots on the bottom; that's what we're recording onto disk from the original sky signal. And that's what's correlated. What we derive our images from is just those four levels of random noise, essentially, saved to hard drives.
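A small simulation suggests why so coarse a quantization still works: a weak correlation between two noisy signals survives 2-bit, 4-level sampling nearly intact. The thresholds and levels below are illustrative choices, not the EHT's exact quantizer:

```python
import math
import random

random.seed(42)

def quantize(x):
    # 2-bit, 4-level quantizer: thresholds at -1, 0, +1 sigma,
    # output levels -1.5, -0.5, +0.5, +1.5 (illustrative).
    if x < -1.0:
        return -1.5
    if x < 0.0:
        return -0.5
    if x < 1.0:
        return 0.5
    return 1.5

def corr(x, y):
    # Normalized correlation coefficient of two sequences.
    sxy = sum(p * q for p, q in zip(x, y))
    sxx = sum(p * p for p in x)
    syy = sum(q * q for q in y)
    return sxy / math.sqrt(sxx * syy)

# Two noisy "telescope" signals sharing a weak common component.
n = 200_000
rho = 0.1  # true correlation (illustrative; real EHT values are far smaller)
a, b = [], []
for _ in range(n):
    c = random.gauss(0.0, 1.0)
    a.append(math.sqrt(rho) * c + math.sqrt(1.0 - rho) * random.gauss(0.0, 1.0))
    b.append(math.sqrt(rho) * c + math.sqrt(1.0 - rho) * random.gauss(0.0, 1.0))

r_full = corr(a, b)
r_2bit = corr([quantize(x) for x in a], [quantize(y) for y in b])
print(f"full precision: {r_full:.3f}, 2-bit: {r_2bit:.3f}")
```

The quantized correlation comes out only slightly lower than the full-precision one (a loss of order ten percent), which is why recording just four levels is such a good trade against data volume.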
Lindy Blackburn: And I might not need to introduce Katie Bouman to you. She is also an MIT graduate student, and came to work with us shortly after. She's actually been with the project longer than I have; she joined really early. She's also come with me for the last few years to the LMT, for observing there and helping with the setup.
Lindy Blackburn: So, this is a picture of about half a petabyte of data from that one site. And it gets combined with all the other sites, forming the several-petabyte data set you might have heard about.
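The scale of that data set follows from simple arithmetic; the observing time below is an assumed round number, just to show the order of magnitude:

```python
# Recording rate from the talk: 64 gigabits per second at one station.
rate_bits_per_s = 64e9
hours_observed = 4 * 8  # assumed: four nights of ~8 hours each

total_bytes = (rate_bits_per_s / 8.0) * hours_observed * 3600.0
print(f"~{total_bytes / 1e15:.2f} petabytes from one station")
```

A handful of nights at these rates already lands near a petabyte at a single station, so the full multi-station campaign reaches several petabytes.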
Vincent told you a lot about the correlator,
and this is where the signals are combined.
And so, this is a fundamental process.
Lindy Blackburn: Correlation is, in principle, very simple, right? This is the correlation coefficient equation you can get on Wikipedia. You just multiply two signals and average them together, normalized by the noise power in those signals.
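Averaging that tiny correlation out of the noise is a matter of sample counts: the signal-to-noise of an averaged correlation estimate grows roughly as r * sqrt(N). A quick sketch with the scale quoted here (the detection threshold is an illustrative choice):

```python
# How many samples must be averaged to detect a tiny correlation?
r = 1e-4           # correlation coefficient scale quoted in the talk
target_snr = 5.0   # illustrative detection threshold

# SNR ~ r * sqrt(N)  =>  N ~ (SNR / r)^2
n_samples = (target_snr / r) ** 2
print(f"need ~{n_samples:.1e} samples")
```

That works out to billions of samples, which is why such enormous raw data rates are needed in the first place.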
Lindy Blackburn: So in our case, I only drew six stations here; we had eight stations in 2017. And then we measure the correlation across all pairs of antennas. One of the important details is that r, the correlation coefficient, is very small. It's something like less than 10 to the minus four, maybe 10 to the minus six.
So what that means, you need to average billions
of samples to actually extract something out
of your noise here.
And so, the key is, how can you average those
billions of samples?
And one of the things that makes that difficult is that each telescope is subject to its own special systematics. So here, I tried to represent some atmospheric turbulence. As the radio waves propagate through the atmosphere, they pick up some unknown delay, equivalently an unknown phase, that's actually varying pretty rapidly with time.
Lindy Blackburn: That creates, on the signal, something that can be interpreted as a complex gain factor. It's changing the amplitude of the signal, making it smaller most likely, and changing the phase of the signal, giving you a little bit of ignorance about exactly the timing of the signal at that one site. And that means that you can't average longer than that gain is stable. So that's why we need to model these g-parameters before averaging the data down to something sufficient to make imaging practical.
Lindy Blackburn: I'll just go back. One more thing. It's a difficult problem, but what really saves us for VLBI is that this g-factor affects each station individually. That means there are unknowns something like on the order of the number of stations; that's how complicated this problem is. Whereas, as we add stations to the array, you can see the number of measurements we get grows as the square of the number of stations, right? That's how many pairs of stations there are.
Lindy Blackburn: So eventually, you add enough stations, and this becomes an over-constrained problem. Calibration becomes, maybe not easier, but you become much more confident in it.
And we can play this game and try to fit all
these things out and get our data at the end.
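The counting behind that over-constraint is simple: unknown station gains grow like the number of stations N, while measurements (baselines) grow like N*(N-1)/2:

```python
def n_baselines(n_stations):
    # One measurement per pair of stations.
    return n_stations * (n_stations - 1) // 2

for n in (3, 5, 8):  # 8 stations, as in the 2017 array
    print(f"{n} stations: {n_baselines(n)} baselines vs {n} unknown gains")
```

With eight stations there are 28 baselines against only 8 station gains, so the calibration problem is comfortably over-constrained.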
Lindy Blackburn: All right, so moving a little bit on to 2017. In 2017, we did a lot of things for the first time. We brought the EHT to, say, its first-generation complete array, where we think we can make images at one millimeter. For the first time, we introduced ALMA, and the character of the data changed a lot from previous commissioning years and early science observations. And we wanted to be very careful about how we processed this data, knowing that it would receive a lot of scrutiny at the end of the day.
Lindy Blackburn: So, we developed three reduction pipelines. They operated more or less independently on the data, and they used different underlying software: HOPS here, which I've worked on. HOPS stands for Haystack Observatory Postprocessing System. Then rPICARD, designed by Michael Janssen at Radboud University, based on CASA, which is a very powerful radio astronomy reduction suite. And AIPS, which is also a classical reduction suite from NRAO.
Lindy Blackburn: And I'm not gonna explain all these little boxes here. But essentially, they're building out this clock and phase model as the data is processed, and as things can be solved for more precisely, until you get a pretty good model at the end, and you can create this output data set.
Lindy Blackburn: So we had these three operating
independently.
This is just an example of atmospheric phase,
'cause it's so important.
And also, to highlight the fact that having
a central anchor station like ALMA, is extremely
useful for this process.
With ALMA, you can essentially phase-lock the array, because it provides such a high signal-to-noise to the other sites. That not only benefits the measurements to ALMA, which are relatively easy, but measurements between those other sites, which can be challenging, especially in bad weather. So this shows what you can get before and after you've phase-stabilized with ALMA: the strength of the coherently averaged signal, over the course of, say, a several-minute scan, becomes stronger and much more stable. We get a much cleaner measurement after we can phase-lock the whole entire array with ALMA phasing.
Lindy Blackburn: So, we've generated this
data, there's also an extensive amount of
cross comparisons and validation.
I mean, this was one of the major reasons
we had the three pipelines in the first place.
We also have different ways of splitting up our data in time and frequency.
And we check self consistency.
So, these are just histograms representing
self consistency of the data across pipelines.
And in this particular case, things are fairly
consistent with their statistical errors.
So, the thermal noise.
And that's really where we want to be.
Lindy Blackburn: It's not always the case, though; we expect some residual systematics, in which case, we fit those. They're about a degree in phase and about one percent in amplitude. So, we know that after we fit everything else, those residual systematics that we're passing down to imaging will become important when the signal-to-noise of our measurements gets beyond 100 or so.
Lindy Blackburn: So, there's one important independent piece of information we have. I haven't covered it yet, and it's not really relevant to the decimation of the data, but that's our independent estimate of station sensitivities. So, we call this a priori calibration, flux calibration.
This is done at the telescopes.
Each telescope has some idea of how much collecting area it has, and some idea of how much noise it's subject to from the atmosphere and the receiver. And this is information that's very hard to really pin down precisely, because you don't have a good ... You just have one little antenna, and the signal's very weak. You don't get quite the clean measurement you do from correlating data across two different sites.
But we do want to bring this information in.
It's hard because all these different telescopes were built by different people, and are of all different types, around the world. And trying to combine this into a systematized way of characterizing sensitivity is quite complicated.
Lindy Blackburn: This huge effort was coordinated by two Radboud University graduate students from the Netherlands, Sara Issaoun and Michael Janssen. So, I wanted to highlight their work here.
Lindy Blackburn: So, one of the reasons this is important is that, before that, the correlation coefficients we measure are really in units of the noise of the telescopes. They're not something we can directly interpret as flux, correlated flux on the sky. But once we know the intrinsic sensitivity, we can then bring them onto a physically relevant scale. So, this part just shows the building up of the data. You probably know about Earth-rotation aperture synthesis; Michael will probably talk about it more later. We're measuring spatial frequencies on the sky, and you can see how those frequencies are built up over time, over the night.
Lindy Blackburn: But on the right here is really what I wanted to show, which is our calibrated visibilities; on the x-axis is the baseline length. So, we've projected what's a 2D thing onto just this 1D representation of baseline length.
Lindy Blackburn: What we can see here is that it's remarkably simple. And we were extremely happy once we saw this, right? It tells you that, even though we've looked at all sorts of orientations, the profile here is fairly similar for the different baselines of the different colors. And you know, this isn't a formal fit to the data or anything, but this is just what you would expect for the flux density, how strong the signals are, versus baseline length, from a simple model: just a thin ring on the sky, perfectly symmetric.
Lindy Blackburn: And you see this nice bounce. And so, this is tantalizing evidence that something like a ring would be a very good fit for our data, once we go through the entire data processing, analysis, and imaging pathway.
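That bounce is what a thin ring predicts: the visibility amplitude of an infinitesimally thin, symmetric ring of angular diameter theta is |J0(pi * theta * u)|, with u the baseline length in wavelengths. A sketch with an illustrative ~42 microarcsecond ring (the Bessel function is computed from its integral form, to stay dependency-free):

```python
import math

def j0(x, steps=2000):
    # Bessel J0 via its integral representation:
    # J0(x) = (1/pi) * integral_0^pi cos(x * sin(t)) dt  (midpoint rule)
    h = math.pi / steps
    return sum(math.cos(x * math.sin((k + 0.5) * h)) for k in range(steps)) * h / math.pi

UAS_TO_RAD = math.pi / (180.0 * 3600.0 * 1e6)
theta = 42.0 * UAS_TO_RAD  # illustrative ring diameter of ~42 microarcseconds

def ring_visibility(u):
    # Visibility amplitude of a thin symmetric ring; u in wavelengths.
    return abs(j0(math.pi * theta * u))

# The first null of J0 is at x ~ 2.405, predicting where the amplitude dips.
u_null = 2.405 / (math.pi * theta)
print(f"first visibility null near {u_null / 1e9:.1f} gigalambda")
```

For Earth-sized baselines at 1.3 mm, that first null falls at a few gigalambda, right in the range the array samples.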
Lindy Blackburn: Finally, there's another
interesting thing that can be seen, just in
the raw data itself.
And this is using something that we call closure
phase.
It's a little complicated, but essentially, we're summing the baseline phases around a closed triangle in the array. The deltas here represent the extra propagation delay from the atmosphere over each station. And you can see that, if you sum around a closed loop, those station terms cancel.
So, you're left with only the closure phase
that is imparted from the source.
So that's a very nice quantity to have.
You don't need to model anything about the
atmosphere to actually get this, you just
need to be able to calculate this closed loop.
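The cancellation can be checked in a couple of lines: give each station an arbitrary phase error, form the three baseline phases around a triangle, and the station terms drop out of the sum. The phase values below are made up for illustration:

```python
import math
import random

random.seed(0)

# Intrinsic (source) phases on baselines 1-2, 2-3, 3-1 (made-up values).
phi12, phi23, phi31 = 0.7, -1.1, 0.6

# Unknown atmospheric phase error at each station.
e1, e2, e3 = (random.uniform(-math.pi, math.pi) for _ in range(3))

# Each measured baseline phase picks up the difference of station errors.
m12 = phi12 + (e1 - e2)
m23 = phi23 + (e2 - e3)
m31 = phi31 + (e3 - e1)

closure = m12 + m23 + m31  # station errors cancel around the closed loop
print(f"closure phase = {closure:.6f}, source-only sum = {phi12 + phi23 + phi31:.6f}")
```

Whatever the per-station errors are, the closure phase equals the sum of the source phases alone.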
Lindy Blackburn: And what you see here are
closure phases for a variety of triangles.
It's gonna be hard to see, I think, all the colors. But we have the four days of observations, and Vincent brilliantly scheduled these observations so we have two days in the beginning, which are right next to each other; two days at the end, which are right next to each other; but a few days in between.
Lindy Blackburn: And what we see is, on some triangles, there's a dramatic difference in closure phase between those first two days and the last two days, while within the two days that are right next to each other, we see pretty self-consistent closure phases. So, this is really interesting. It tells us that the compact structure is evolving over those four days. It's not unexpected; it's not unbelievable for a seven-billion-solar-mass black hole, since it's on the order of the dynamical timescale.
But it really gives us the exciting possibility that we could eventually make movies of this kind of evolving structure over time, if we plan our observations correctly.
Lindy Blackburn: So this is where the data
story ends.
I'll leave it to Michael to really show you
how the imaging is done, and what the science
we can extract is.
Michael Johnson: Great.
Okay.
So before we dive into imaging, I just want
to give you a sense of why this was such a
tricky problem for us.
So, we're trying to make a picture of something
that no one has ever seen before.
We're trying to do this with data that's extremely
difficult to calibrate.
And we're doing this with an instrument that
no one's ever used before.
Okay.
So we're pointing up at the sky and we're
looking with new eyes.
There's nothing we can look at and say, "Oh,
yeah.
That looks like what these other people got.
So, our instrument must be okay."
Or, "Oh, yeah.
That looks like what we got last year, it
must be okay."
Everything is for the first time.
Michael Johnson: And so, it's this incredible
challenge of, how do you develop confidence
in a result in this environment?
And how can you develop confidence in something
so profoundly new, as a picture of a black
hole?
That's what I'm gonna walk you through a little
bit.
Michael Johnson: So, the first thing that
we decided to do was that, we were gonna break
our imaging up into four separate teams.
So we said, before we do any imaging at all,
we're gonna divide up the data that Lindy
sends us.
At this point, this is data that can just
be emailed, it's a very small amount of data.
So it's divided up among four teams.
And we said, we want them each to make images
blindly, okay.
And this isn't just an imaging algorithm that's
blind, it's actually the entire process of
considering what the data are telling us.
Michael Johnson: So, some groups might say,
"Oh, I think this station has really poor
performance."
And other groups might say, "I think that
station's fine, and that those are data properties
of the source."
And other groups might say, "We're gonna flag
that data out.
We're not even gonna touch it, because we
think it's contaminating the results."
Another group might say, "Well, we think we
can calibrate in a particular way."
Each group makes its own algorithms, each
group picked its own choices.
Michael Johnson: So, we've got the data.
And we spent seven weeks, with no communication
at all among these four teams.
And then we got together at a workshop, and
we compared the first images that the teams
produced.
And I'm so delighted to share with you the
first images ever with our new plum color
map, that no one has ever seen before.
It's actually quite striking.
We actually spent months discussing this;
how do you visualize these images?
There's no color in our images, it's just
brightness.
So, actually, the most faithful way to show
it is just on grayscale.
And so, we went to great trouble to make things
perceptually uniform and all these things.
And it's striking to see how different the
images look, when you recolorize them.
Michael Johnson: In any case, so we came together.
The four teams revealed their results.
And we saw each team had reconstructed a ring,
they all had about the same size, and they
were all brighter in the bottom.
And that's when we knew we really nailed this
thing.
We knew that there was an image of a black hole in the sky, and all we had to do to turn this into a final image was refine the algorithms.
Michael Johnson: This is a picture of the
first imaging workshop.
Enlightening.
And that is the first image of a black hole.
And I just wanted to highlight a few of the
key contributors here.
Andrew Chael and Kazu Akiyama, both developed
imaging libraries as graduate students.
So Kazu is now a postdoc at MIT Haystack,
and Andrew's just finishing his PhD at Harvard.
But we use three different imaging techniques
in our final results.
And these two wrote the software; they led the development of the software for two of them.
Michael Johnson: In addition to that, Daniel
Palumbo, you've seen before, and you'll see
a lot of familiar faces in here, including
Lindy.
It's a small project, we all do everything.
And it's just a huge amount of fun.
Michael Johnson: Shep Doeleman is the director of the project.
He was at MIT Haystack for a very long time,
and was the assistant director there.
He's now at the Center for Astrophysics.
Katie Bouman is an MIT graduate from [inaudible
00:32:52], who's had a huge role in these
results.
And we even got some new friends along the
way.
So Peter Galison, who's here today, he's a
Harvard faculty in philosophy, actually in
history of science and physics.
And I think he came into this expecting that
he was gonna make a documentary.
But by the time we got to the workshop, he
had his sleeves rolled up, the laptop out,
and he's working away with the rest of us.
Likewise, with Ramesh Narayan, a theoretical
astrophysicist.
Michael Johnson: There's something just really
special about this problem.
You have a few hundred measurements, and in
those measurements, is hiding the picture
of a black hole.
And it's this really pure mathematical problem
of, how do I turn my measurements into this
picture?
And how do I track that and understand what
I'm doing.
So, a lot of people are being captivated by
this project.
Now, I just want to give you a sense of what
it looks like to turn these data into a picture.
Michael Johnson: So, at each point, if you
have some data, you have two goals.
You want to make a picture that's consistent
with the data you have, and you want to make
a picture that's maximally conservative with
respect to all the information that you don't
have.
So, this is a snapshot of that video that
Lindy showed you.
This is just one baseline on the earth joining
those two sites.
And in the center, that's the baseline coverage
you have.
So, the points are the information you have,
and where there aren't points is all of the
unknown.
There's this desert of unknown that you're
just sampling a few points in.
Michael Johnson: So, in this particular case,
if we take our imaging algorithms, what we
get with just this one baseline, it's just
a big blob on the sky.
We're basically sampling one spatial mode
on the sky.
We know how much power is there, so that tells
us how big this thing is, and we don't get
anything more.
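As a toy illustration of this "desert of unknown" (this is not the EHT pipeline; the ring image and the sampling pattern below are made up), one can sample just a few Fourier components of a synthetic image and invert them, to see how little structure survives:

```python
import numpy as np

# An interferometer measures a few Fourier components ("visibilities")
# of the sky image. With only a handful of samples, the naive inverse
# transform is just a big low-resolution blob.
n = 64
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
r = np.hypot(x, y)
truth = ((r > 8) & (r < 12)).astype(float)   # a ring, like the shadow

vis = np.fft.fftshift(np.fft.fft2(truth))    # the full Fourier plane

# Keep only a sparse set of (u, v) points, as one short baseline
# track would give; everything else is the unknown.
mask = np.zeros_like(vis, dtype=bool)
mask[n // 2, n // 2 - 3:n // 2 + 4] = True

dirty = np.fft.ifft2(np.fft.ifftshift(vis * mask)).real
print(dirty.shape)   # the "dirty image": one blurry blob, no ring
```

With more (u, v) points in the mask, the ring gradually reappears, which is the progression described next.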
Michael Johnson: Now, we add a second baseline.
The second baseline adds more information
at different modes.
It adds a little bit of high frequency information.
And so, that breaks this blob symmetry.
It tells us that there has to be some small
scale structure in the image.
And you see that it splits and it starts to
become more intricate.
Michael Johnson: Now, we add a fourth site,
and now we have six baselines.
As Lindy said, the number of baselines grows
roughly with the square of the number of sites:
N times N minus one, over two.
So it goes rapidly as you add more and more
sites.
And so, it's the type of project where you
have nothing, nothing, nothing.
And then all of a sudden, boom, you have an
image.
And it's really exciting, just to see where
is that point.
And for us, that was in 2017.
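The baseline counting described here can be sketched in a few lines of Python (the site counts below are illustrative, not the actual EHT station list):

```python
# Each pair of sites gives one baseline, so an array of N sites
# yields N * (N - 1) / 2 baselines: coverage grows roughly with
# the square of the number of telescopes.
def n_baselines(n_sites: int) -> int:
    return n_sites * (n_sites - 1) // 2

for n in (2, 3, 4, 5, 8):
    print(n, "sites ->", n_baselines(n), "baselines")
```

Going from five to eight sites nearly triples the number of baselines, which is the "nothing, nothing, then boom" behavior described above.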
Michael Johnson: So here, now you can see,
we start getting information at different
position angles.
They're pointed in different directions.
Now all of a sudden, our image is becoming
more intricate.
You can see this ring, it's hiding in there.
So, we add one more site in Spain, and then
all of a sudden, the sharp ring pops out.
And so, that's the power of Fourier information:
you can mask out almost all
of the Fourier plane, but what's left is still
enough to get you to that picture of a black
hole.
Michael Johnson: What does this mean though?
How do we use this to inform our knowledge
of black holes?
What does it mean for general relativity?
What does it mean for science?
And to understand that, you have to understand
why a black hole has a ring in the first place.
So, this animation shows that, it turns out
that light propagating near a black hole
doesn't follow straight lines.
It's curved, because the black hole is curving
space time.
Light can even wrap around the black hole,
it can go in circles around the black hole.
Michael Johnson: So, what happens is, if you
light up the space time near a black hole,
you have lots of photons flying everywhere.
The ones that are too close to the black hole,
or the ones that are pointed at it, fall in.
The ones that are further away, get pulled
around it.
And the net effect is that the black hole
is casting a shadow on that bright surrounding
emission that's almost perfectly circular.
Michael Johnson: So, let's say that we want
to turn this into a test of general relativity.
Well, the first thing we need to know is,
what's the size of that circle in the sky.
It turns out mostly that depends on the black
hole's mass, bigger black holes cast bigger
shadows.
It's just proportional to mass.
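The shadow-size scaling described here can be checked with a back-of-the-envelope calculation; the mass and distance below are the illustrative M87 values quoted in the talk, not an official measurement:

```python
import math

# Shadow angular diameter for a non-spinning black hole:
# theta ~ 2 * sqrt(27) * G * M / (c^2 * D).
G, C = 6.674e-11, 2.998e8          # SI units
M_SUN, MPC = 1.989e30, 3.086e22    # kg, meters

def shadow_uas(mass_msun, dist_mpc):
    theta_rad = (2 * math.sqrt(27) * G * mass_msun * M_SUN
                 / (C**2 * dist_mpc * MPC))
    return theta_rad * 180 / math.pi * 3600e6   # radians -> microarcsec

# ~6.6 billion solar masses at roughly 16.8 Mpc gives ~40 microarcseconds
print(round(shadow_uas(6.6e9, 16.8), 1))
```

Because the answer is directly proportional to mass, the 3.5-billion-solar-mass gas estimate would predict a ring roughly half this size, which is the "easy check" mentioned below.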
Michael Johnson: Now for M87, we have a couple
of handles on what the mass might be.
One, is to look at the motion of gas on the
scales of tens of thousands of [inaudible
00:36:56] yield [inaudible 00:36:57].
So, pretty large compared to the size of the
black hole.
Michael Johnson: These suggested, before the
EHT measurements, that the black hole had
a mass of about 3.5 billion solar masses.
On the other hand, if you look at stars, they
also tell you right there, zipping around
something there, you can't resolve individual
stars.
But by their bulk motion, you can estimate
that the black hole might have a mass of [inaudible
00:37:19] 6.6 times 10 to the nine ... 6.6
billion times the mass of the sun.
These are enormous numbers.
Michael Johnson: So, the other thing is, these
are very large scale measurements.
There's nothing to say that there's not something
in nature that pops up and prevents a black
hole from forming somewhere within this
huge range of scales that you're probing.
Something like a factor of 10 to the four.
From the scales, where you're actually sampling
what's living inside some sphere, and you're
actually seeing it directly.
So, you might expect that the stellar orbits,
six billion solar masses, a black hole might
actually live there, and it could have less
mass than that, and other stuff could be distributed.
But it couldn't have more, or you'd see the
orbits reflect that.
So, if there was a black hole, and it had
all the mass that you're expecting from the
stars, and the black hole isn't spinning, that's
the size you'd see on the sky.
Michael Johnson: Now, it turns out, if a black
hole is spinning, it makes it a little bit
smaller.
So that's the size of the shadow for a maximally
spinning black hole, with the 6.6 billion
solar masses.
And so, this little range in between those
two, which isn't much of a range, tells
you the range expected for canonical GR.
That's if Einstein's right and black holes
behave in the way that we think they do.
Michael Johnson: Now, if instead the black
hole has the 3.5 times 10 to the nine solar
masses, then you get a smaller ring.
So, easy check.
We can see whether one or the other of these
is correct.
But just for fun, what about other things?
So, it turns out, wormholes, it could be a
wormhole in the middle of M87.
Who's to say it couldn't?
And it turns out that wormholes have a smaller
shadow than black holes.
So, here's the size of a wormhole.
If the mass is 6.6 times 10 to the nine solar
masses, let's push beyond that.
What if it's something exotic in GR, but
there's no event horizon?
What if it's a naked singularity?
So, it's a singularity in space time, but
it's not shrouded by an event horizon
around it.
And it turns out that those cast a different
shadow still, a much smaller shadow.
Michael Johnson: So, we have all these predictions
for what the shadow might look like, and then
we can sit the EHT image on top of those.
And you instantly rule out all of these exotic
possibilities, and you instantly rule out
the gas dynamical measurement.
And we're left with a stellar dynamical measurement.
But more than that, we've said that all of
that mass that the stars see is concentrated
within a tiny region.
Within the region where photons would go in
circles around the material.
So it's really quite a stunning confirmation
of GR.
Our results are perfectly consistent with
the predictions of GR.
Michael Johnson: But we can push beyond this.
So this is the image we see, but we think
that lying beneath it is a much more complicated
system.
And we're limited by the resolution of our
array.
And we think this system is alive.
It's an intensely dynamical cauldron of hot
gas, hundreds of billions of degrees, that's
spinning around this black hole.
And some of this gas gets pulled in, and it
goes through the event horizon.
Some of it flies out on jets.
But what we think now is that, the black hole
is the driver of all of that dynamical activity.
Michael Johnson: Now, this is happening fast.
So, this is a black hole big enough to engulf
our entire solar system.
But this black hole, we would see it evolve
on timescales of days.
So, in this particular simulation, one second
corresponds to two weeks in real time.
So that's what Lindy said, we see in our data,
we don't even need images.
We see that the system is evolving before
our eyes during an observation.
And what can we do with that?
Michael Johnson: So, I think we have this
interesting preview.
If you go to seven millimeters, so this is
a little bit longer wavelength, and it turns
out that the jet isn't translucent at that
wavelength.
When you look at it, you're not actually peering
all the way down to the black hole.
So, the jet is opaque, and you see this beautiful
extended jet.
And you take observations over many months
and you can patch them together.
And you see that this jet is alive and it's
writhing.
The black hole is expelling huge amounts of
energy that are just rippling down the length
of this jet.
That's the EHT image.
And you can see the size of it when superimposed
on this image, which is one of the highest
resolution images of M87 ever made before
the EHT.
So, what we know now is that, the black hole
really is lying at the heart of that system.
Michael Johnson: And what's most interesting
is that the EHT simulations that we have,
these numerical simulations that are consistent
with the EHT results, suggest that the energy
in the jet is actually not coming from the
material that's falling on the black hole.
It's coming from the black hole itself.
Once matter falls into a black hole, it's
gone forever, you can't get that back.
But it turns out that if a black hole is spinning,
you can slow it down and pull the rotational
energy from the black hole.
Michael Johnson: So, if there are magnetic
fields threading the black hole, it drags
them along with it.
And it winds them up like a motor.
And that's exactly what we think is happening
to this jet, and it's expelling huge amounts
of energy.
Michael Johnson: If the black hole in M87
is spinning rapidly, it has enough energy
that you could extract to power a billion
suns for the age of the universe.
So, this is an incredibly powerful engine.
And we're just getting started understanding,
watching it live and breathe.
And understanding the role of the black hole
in these systems.
Michael Johnson: So let me just close by saying,
this is a huge collaboration.
It's just such a joy to be able to share the
results with you, and MIT has played a
huge role in this project.
So, thank you.
Joe Paradiso: Okay.
So 
I think we're going to evolve into a panel
here.
So our speakers will be seated.
I'm gonna kick off with a few questions, and
then we'll take questions from the audience.
To avoid the glare of the lights, maybe I'll
stand here, at least for now.
Let's see.
Joe Paradiso: So, looking at these things,
actually seeing them in false color, do people
really understand that it's not really that
color, right?
These are radio waves.
And we're looking at them in terahertz.
But what actually are these photons?
Are these hard x-rays or gammas that have
been gravitationally shifted down?
Michael Johnson: So these are actually radio
waves being emitted by synchrotron radiation.
So these are electrons that are gyrating very
rapidly around magnetic field lines.
And they're sending out broadband.
They're not emitting lines, they're just sending
out swaths of energy across a huge range of wavelengths.
And the red shift of these electrons that
are actually emitting, is actually fairly
modest.
It's actually x-rays, but they're red shifted
down to the radio.
Joe Paradiso: Okay.
And what about Doppler?
Because the thing is whipping around.
So I would imagine that you're gonna see real
shifts and frequency, depending on what direction
they're coming and where they're coming from.
But if it's broadband enough, that may not
affect your detector, you're just accepting
everything.
But is Doppler effect that you have to account
for, like you would in a regular observation,
or not?
Vincent Fish: Well, it is, but in a different
way.
So, the material that's emitting is moving
at a substantial fraction of the speed of
light.
And relativity predicts that when that happens,
the emission gets beamed in a forward cone.
And so if you look at the images, they're
asymmetric, there's one side that's brighter
than the other.
And so, that tells you that the approaching
side of the accretion flow is the bright side.
Vincent Fish: Now, of course, we're seeing
this in perspective, the three dimensional
spin vector is pointing out toward us, but
not directly at us.
It's off to the side of it, 15 or 20 degrees
or so, though it would be complicated to put up
there.
But you can definitely see that the Doppler
effects combined with relativity make a big
[crosstalk 00:45:21].
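A rough sketch of the Doppler beaming Vincent describes; the flow speed, viewing angles, and spectral index below are made-up illustrative values, not fitted EHT parameters:

```python
import math

# Relativistic beaming: observed flux scales as the Doppler factor
# delta to the (3 + alpha) power, where alpha is the spectral index.
def doppler(beta, theta_deg):
    """Doppler factor for speed beta (in units of c) at angle theta."""
    gamma = 1 / math.sqrt(1 - beta**2)
    return 1 / (gamma * (1 - beta * math.cos(math.radians(theta_deg))))

beta, alpha = 0.5, 1.0                 # illustrative guesses
approaching = doppler(beta, 20)        # side moving toward us
receding = doppler(beta, 180 - 20)     # side moving away
contrast = (approaching / receding) ** (3 + alpha)
print(round(contrast))                 # tens: one side far brighter
```

Even a modest flow speed produces a large brightness asymmetry, which is why the image is so much brighter on one side.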
Joe Paradiso: This is like stellar aberration
essentially, where the whole universe just
collapses, as you push toward the speed of light,
to a blue point in front of you.
Vincent Fish: Yes.
Lindy Blackburn: Yeah.
Also you mentioned Doppler.
And you remember the simulation that Michael
showed with the dynamical M87, a very high
resolution, it's a beautiful simulation.
And so when those are created, when the ray
tracing is done to actually see what it looks
like, the Doppler effect is included.
And when we sample the emission, it's sampled
including all of those effects.
Joe Paradiso: You mentioned increasing resolution.
I mean, it's amazing to see this, everyone
wants to see something like the animation
at some point.
Of course, that's a tremendous challenge.
But what actually would the challenges be?
Is it lambda over d?
Is it because you have to integrate over days
to make this image, so you have motion blur?
What are the things that are limiting us?
And what do you think we can overcome?
Michael Johnson: So, there are a couple parts
to that.
These images are diffraction-limited.
So you can see these circles on them that
tells you the instrument beam, the point spread
function.
And so you can see that these are actually
convolved with that beam, so we're left with
only the things that we absolutely trust.
So the way to beat that, there are two ways.
One is longer baselines.
So we're thinking about how to push this into
space.
If you could launch a couple of orbiters or
you could put something on the moon.
There are many things that one can think of
that would allow you to push to much higher
resolution.
There's even an active orbiter called RadioAstron,
that observes with baselines that go out to
the distance of the moon, just at longer wavelengths.
So, we know that this is technologically feasible,
it's gonna happen.
It's not gonna happen today, but it's gonna
happen soon.
Michael Johnson: The second thing that we're
doing is, we're looking at higher frequencies.
So, [inaudible 00:47:07].
Vincent, you want to talk about that recent
...
Vincent Fish: Yeah.
Well, there is a push toward observing at
345 gigahertz, which is the next atmospheric
window.
That's a factor of one and a half higher in
frequency.
So, your angular resolution improves by a
factor of one and a half.
Joe Paradiso: Great.
And that's been demonstrated.
Are any space missions planned at this point,
specific missions?
Or is this totally on the drawing board?
Michael Johnson: Well, it's somewhere in between
those two.
We're working on it.
Joe Paradiso: This will add a little bit of
momentum to it.
Lindy Blackburn: Well, also, we're not just
stuck, we haven't filled the earth.
So one thing is, the 2017 results, they had
this pretty sparse array still.
We just had the five geographic locations.
In 2018, we added Greenland.
And we were hoping next year to add another
site in France, and another in Arizona, at
Kitt Peak.
So, this gives us longer baselines, which
are very exciting, the image will become sharper.
It also gets us short baselines, which are
also exciting.
Because then we might have a chance of seeing
the extended jet.
And then we'll see how this black hole actually
links up with this massive object that actually
goes for tens of thousands of light years.
Joe Paradiso: That relates to another thing.
When I got the picture, I did what some of
us may have done.
I put it into Photoshop and I just pulled
the curve.
'Cause I saw these hints of [inaudible 00:48:29]
things that, at 45 degrees, you seem to see
some hazy cloudy patch.
Is that the beaming of the jet?
Is it an artifact?
Or is it something between the earth and there?
What is this stuff?
Michael Johnson: Those are artifacts.
So, this is the limited UV coverage.
We're just stretching our algorithms to their
limit.
We can suppress those artificially, but we
chose not to.
We want to keep a floor that represents the
uncertainty in the image, and not impose too
much of our own will on what it should look
like.
That's the stuff that we can't sharpen.
I mean, our dynamic range right now is 10.
It's not much.
Joe Paradiso: I'm sure you had a lot of people
trying to ask these questions about these
things, but I suspect that they were artifacts.
But we'll see the real extra phenomena soon
when you-
Michael Johnson: I mean, make no mistake,
we spent months wrestling over these questions.
Joe Paradiso: Yeah.
Of course.
This is the question everybody has asked you,
and probably we continue to ask.
This is wonderful.
You get this big, huge black hole within range
that we can do this.
But there are lots of other things.
So, the Sagittarius A-star, which you guys
had talked about, that's the one here in the
Milky Way.
It's not so active.
It's not like this is flowing all the time,
fortunate for us perhaps.
But will we be able to see that?
Joe Paradiso: How about Cygnus X-1, that was
the poster child of black holes.
It's only 6000 light years away.
Could we try to look at that?
Is there a possibility it's smaller but it's
close?
And then there are weird things.
I mean, this is radio astronomy, not just
for black holes, but for any kind of emission.
What other kind of things can you look at?
People talk about Tabby's star, I don't know
if that emits at these wavelengths.
But we can maybe see what's causing this really
weird, irregular profile from the emission.
So, what kind of things you're gonna look
at, or what kind of things can we look at?
Vincent Fish: So that's really three excellent
questions.
And I think that we should take those in turn.
Joe Paradiso: It's three people.
Vincent Fish: Sagittarius A-star is one of
our primary targets.
It turns out that it's a harder source to
observe because it changes so quickly.
So, when Michael showed the video of the simulation
of M87, you could see that on the timescale
of days to weeks, the source structure changes.
But each of the images that are up on the
screen, are from one night of observing.
So over the course of a single night, the
image is effectively static.
Vincent Fish: Now for Sagittarius A-star,
the source changes on a timescale of minutes.
And you can just point a single dish out and
you can see it flicker.
And so, if the material is moving around it,
then when your baseline is ... When you have
data from two telescopes now, you get a point
in the aperture plane, the UV plane.
And then you wait for the earth to rotate
so you get different points.
Well, now those are from different images,
and so, you get a blurring.
And Michael has actually led an effort to
try to figure out how to do dynamical imaging
from sparse data.
Do you want to say anything about that?
Michael Johnson: Well, I mean, it's the excitement.
It's the challenge and the excitement.
A day for M87 is a minute for Sgr A*.
And so, we really do have the chance to watch
Sgr A* evolving before our eyes, and
to see it.
It's just a computational challenge, we need
more information.
But we do.
We don't want to reconstruct images, we want
to reconstruct movies.
And we're just scratching the surface of what
can be done there.
That's where I think, a whole new community
of scientists can bring a flood of ideas and
make this happen.
Joe Paradiso: And how about the smaller black
holes that are closer?
Cygnus X-1 and other sources that may not
be black holes?
Vincent Fish: The angular size of the event
horizon is proportional to the mass of the
black hole, divided by the distance to the black
hole.
And there are black holes that are a lot closer
than the center of the galaxy, but those are
stellar mass black holes.
So tens of solar masses, and that's a factor
of 100,000.
And so, if you had one of those sources that
was, gosh, a parsec away or less, then you
could do that.
But that might cause other problems.
Joe Paradiso: Yes.
[crosstalk 00:52:34] in the sweet spot.
It's good they aren't there.
Vincent Fish: Yeah.
You're not gonna get the same angular resolution
on them.
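The M-over-D scaling Vincent describes can be made concrete; the masses and distances below are rough textbook values for Sgr A* and Cygnus X-1, used for illustration only:

```python
# Angular size of the event horizon scales as mass over distance.
def relative_size(mass_msun, dist_pc):
    return mass_msun / dist_pc   # arbitrary units, just the M/D ratio

sgr_a = relative_size(4.1e6, 8200)   # ~4 million Msun, ~8 kiloparsecs
cyg_x1 = relative_size(21, 1850)     # ~21 Msun, ~6000 light years
print(round(sgr_a / cyg_x1))         # tens of thousands of times larger
```

So even though Cygnus X-1 is thousands of times closer, its shadow is tens of thousands of times smaller on the sky than Sgr A*'s, far below the EHT's resolution.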
Vincent Fish: The third question you asked
was on other types of sources.
And because the aperture isn't very well filled
for a very long baseline interferometry, you
really need sources that have high, what we
call brightness temperature.
Vincent Fish: So effectively, if you had a
black body there that were emitting that thermally,
how hot would that need to be?
And the answer is up around the 10 to the
nine Kelvin range, a billion degrees.
Shorter baselines don't require as high of
a brightness temperature.
So, you're really talking primarily about
very hot things and non thermal things.
So, in addition to active galactic nuclei,
you could do this with some astronomical masers.
Those are spectral lines sources.
Lindy Blackburn: Also pulsars.
Vincent Fish: And pulsars, yeah.
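Brightness temperature in the Rayleigh-Jeans limit is Tb = S lambda^2 / (2 k Omega); the flux density and source size below are illustrative guesses for a compact ring-scale source, not measured EHT values:

```python
# Rayleigh-Jeans brightness temperature of a compact radio source.
K_B, C = 1.381e-23, 2.998e8      # Boltzmann constant, speed of light (SI)
JY = 1e-26                       # jansky in W m^-2 Hz^-1
UAS = 1 / 206265e6               # microarcseconds -> radians

def t_brightness(flux_jy, theta_uas, freq_hz):
    lam = C / freq_hz
    omega = (theta_uas * UAS) ** 2   # crude solid angle of the patch
    return flux_jy * JY * lam**2 / (2 * K_B * omega)

# ~1 Jy in a ~40-microarcsecond patch at 230 GHz
print(f"{t_brightness(1.0, 40, 230e9):.1e} K")   # well above 1e9 K
```

A source like this clears the billion-degree threshold easily, which is why AGN, masers, and pulsars are the natural VLBI targets.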
Joe Paradiso: Finally, it's a really exciting
time in physics, with all of these new windows
opening.
So, the best is yet to come.
But so far, you guys have been confirming
Einstein.
I mean, LIGO confirmed Einstein and you guys
did, and it's amazing.
But his shadow looms very far.
We have quantum gravity and all these other
theories that attempt to push further, and
it'd be great to see something that points
to a new theory.
Or at least in something that can't be easily
described, like dark energy or dark matter,
coming out of this.
Joe Paradiso: And CERN, of course, has just
verified the Standard Model with the Higgs, which
is a great discovery, but there's no new physics.
So, what do you think about the signs of new
physics?
You looked a little bit at rotating black
holes and signature [inaudible 00:54:19] anomalies?
What things would you maybe look for?
Or is there hope we could see one?
I guess there's always hope.
Lindy Blackburn: Sure, yeah.
So, it's always interesting to do a test in
a completely different regime.
So, the thing here is, it's the first
time we have direct evidence and a direct
view into a 10 to the nine scale, billion solar
mass black hole.
The types of tests you can do with this image
versus gravitational waves is different, each
is testing some deviation within its own domain.
Lindy Blackburn: But as you can see, it's
a blurry image.
We're not gonna have a very precise test of
GR at this stage.
And we'd be very surprised if we thought whatever
we can constrain to the 10 or 20% level would
really be enough to really pin down some small
deviation to the theory.
Michael Johnson: I also think, the crown jewel
of this is going to be Sgr A*.
So for M87, we don't see individual stars
going around it.
We had this roughly factor of two uncertainty
in the mass.
And if we didn't, then our current measurement
accuracy might have been good enough to estimate
the spin of the black hole.
You know, that narrow range of possibilities.
Michael Johnson: Now, Sgr A* is different,
it's close to us.
We see these stars zipping around it.
You've probably seen these movies, these incredible
movies, tracing stellar orbits and seeing
them just be whipped around by an absolute
void.
And because of that, we know the mass and
distance of Sgr A* perfectly.
I mean, to exquisite accuracy.
And so, if we can extend the EHT, then we
can really push that.
Michael Johnson: We know that GR breaks down
in these regimes.
It's fundamentally incompatible with quantum
mechanics.
There has to be new physics living here.
I think it's very much a stretch to think
that you could see that with this resolution.
But with Sgr A*, there's such a tight
prediction for what you would see that I think
that that's something that over the next few
years, as we match these images with our own
galactic center, that's gonna be really exciting.
Joe Paradiso: Man, you got my hopes up.
Hope to see you in Stockholm.
Joe Paradiso: Anyway, we have permission to
go a little bit longer actually here.
So, maybe 10, 15 minutes, if there are questions
in the audience.
I'm sure there are.
So, let's pass the microphone around.
Put your hand up if you would like to ask
our experts any question?
Got to be somebody here.
Speaker 5: Right here.
Joe Paradiso: Oh, there we go.
Speaker 5: Hi, thank you.
Does the fact that the mass estimate for M87
agrees with the stellar estimate, rather than
the gas estimate, mean that all the other black
holes, where we only know their masses by their
gas dynamics, are maybe also off by a factor
of two?
Does that matter?
Michael Johnson: I do think that there's a
lot of reason to question those now.
Even our own galactic center, there was a
factor of two discrepancy within the last
20 years.
And these are not trivial models.
It's not like with ours, where to estimate
the mass of a black hole, you can just read
it off with a ruler.
There's nothing complicated there.
To infer the mass of a black hole on these
much larger scales, using the dynamics of
the gas, that's a much more complex process.
So, I think this does push, hopefully, it
will push these communities to develop better
models, and understand how to make them more
generalizable.
Joe Paradiso: Okay.
There's a question down ... Oh.
There's one down here.
Speaker 6: Roughly speaking, relative to the
plane of those images, where does the rotation
axis point?
Michael Johnson: So, it's really confusing
that the jet, you see it going off to the-
Speaker 6: The jet is along this rotation
axis, yes?
Michael Johnson: That is the thought.
Although that doesn't have to be the case.
So, this jet is actually pointed almost straight
at us.
It's tilted off about 18 degrees.
Speaker 6: 18 degrees is enough to make that
sharp contrast between the bottom half and
the top [crosstalk 00:58:35]?
Michael Johnson: That's right.
That's why you only see one side of the jet.
The other thing, you see it right at the base,
and then it gets accelerated.
And so, it's pointing away from you.
[inaudible 00:58:42] Doppler boosting, so
you don't see it.
You only see it when it's going slow.
Michael Johnson: But what we know from these
images, Vincent was talking about the bright
on the bottom.
That's where the stuff's coming towards you.
So, you can imagine that this thing is twisting
towards you on the bottom, and it's going
away from you on the top.
And so, this jet is just a little off the
line of sight, and the spin vector is pointed
away from us.
Joe Paradiso: So if we were to zoom out, we'd
see an accretion disk form pretty much around
that donut, because the axis is pointing
toward us.
Michael Johnson: But it will be weak.
And so you need the dynamic range to be able
to manage it.
Joe Paradiso: Yeah.
You're not emitting as much there, it's right
close to the [inaudible 00:59:21].
Michael Johnson: Yeah.
So, we actually don't know what's causing
this emission.
It could be the disk, it could be the jet,
it could be some part in the middle that's
both disk and jet.
This image is dominated entirely by the gravity
of the black hole itself.
And so, that's why we need increased dynamic range.
Joe Paradiso: It's just electrons being bent
as they whip around essentially.
Michael Johnson: Well, photons.
Yeah.
Joe Paradiso: Okay.
More questions.
Speaker 7: As you start putting more of the
telescopes onto satellites, and putting
them in space, you get outside of the transmission
window of the atmosphere.
What's the limitation on the frequency you
would look at then?
Vincent Fish: So, there isn't necessarily
much of a limitation on that.
But it depends on the approach that you take.
So, if you're contemplating a space VLBI mission,
you have to ask, "Do I want to do interferometry
just between the satellites in space?
Or do I want to do space-ground interferometry?"
And it is a lot cheaper to leverage the large
apertures on the ground for your sensitivity
than to put large things up in space to provide
equal sensitivity.
So, there is incentive to keep common frequency
bands with the ground.
But in principle, for space only baselines,
you're pretty much unlimited on frequency.
Michael Johnson: Well, yeah.
So the receiver technology to do the heterodyne
things, we're actually recording this full
coherent wavefront, that's tough to get up
to a terahertz even.
There's no reason that couldn't go higher.
And a terahertz would already be, that's a
factor of four higher.
At that point, the jet is completely translucent,
and you really are just looking right at the
horizon.
Joe Paradiso: The far side of the moon [inaudible
01:01:13] would be a great place to put an
array for you guys.
Although it's gonna be monthly, it's gonna
steer once a month instead of once a day.
Michael Johnson: The problem is the time scale.
You're getting comparable to the timescale
of M87, so you're not getting all of this
dense sampling [inaudible 01:01:27] enough
time to make an image.
Vincent Fish: Yeah.
My view on that is that, you can make arguments
for things out to about geosynchronous or
a little bit beyond.
So, satellites in low Earth orbit whip around
so fast that you actually fill in a lot of
those missing spacings.
And so that's really great for dynamic range
and maybe essential to make really good movies
of Sagittarius A-star.
Vincent Fish: When you go out to geosynchronous,
now you're getting tracks in the UV plane
once a day, which is commensurate with the
timescale of variability of M87, and actually
of other quasar sources, AGN sources, that
we observe with the EHT.
But if you go too much beyond it, then you're
on that month timescale.
And your angular resolution is so fine that
everything is moving, every source is moving
or changing on that angular scale, much faster
than that timescale.
So, I think there is a practical limit to
that, unless you put up a lot of satellites.
Joe Paradiso: Yeah.
Okay.
More questions.
Speaker 8: So often, what we consider to be
noise in one era, turns into being signal
in another.
And I'm wondering, is there interesting noise
that's on the cutting room floor right now,
that you'd really like to spend some time
with?
Lindy Blackburn: So, we're certainly not trying
to throw away any signal from our source as
systematics.
Although it's not crazy; people do try to,
for example, average over variability in
Sgr A*.
Not on the current data, but people have proposed
ways to help stabilize the data, to make images
at all.
But as far as systematics remaining in the
data, there are things like the atmosphere.
So, if you're interested in that, and we are
interested, 'cause we're radio astronomers,
modeling the atmosphere will help pull out
the signals.
We'll learn something about the weather at the
sites and how to characterize it.
Joe Paradiso: Any more questions?
You guys are quiet.
Okay, we have Juliana in the foreground.
Juliana: Building on the discussion about
space mission architectures, are you familiar
with the [inaudible 01:03:40] radio distributed
antenna array, out of the Netherlands?
It's a pico-satellite radio antenna
array in orbit around the moon and-
Michael Johnson: It was launched just a year
or two ago, is this one?
Juliana: It may still be under development
but [crosstalk 01:03:56].
Michael Johnson: Okay.
Do you know the wavelength that it's [inaudible
01:03:59]?
Juliana: I don't offhand.
I know that the intention is to measure
phenomena of the early universe.
Michael Johnson: I see.
I think those tend to be at longer wavelengths.
There's a trade-off here: we want to go to
high frequencies and get higher resolution.
But in doing so, you need much higher surface
accuracy for your dishes.
You have to make the dishes smaller.
And the receivers have more noise.
And so the overall problem is that
there's some sweet spot between
pushing to the highest frequency you can
and still having enough sensitivity to see
anything.
So, this is why the current space VLBI missions
are still about a factor of 10 too long
in wavelength to be able to peer down to
these scales.
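The factor of 10 Michael mentions follows directly from the diffraction limit: resolution scales as wavelength over baseline. A quick sketch with round, assumed numbers (an Earth-diameter baseline and the EHT's 1.3 mm observing wavelength):

```python
import math

# Diffraction-limited resolution of an interferometer is roughly
# wavelength / baseline. Convert radians to microarcseconds.
RAD_TO_UAS = (180 / math.pi) * 3600 * 1e6

def resolution_microarcsec(wavelength_m: float, baseline_m: float) -> float:
    return (wavelength_m / baseline_m) * RAD_TO_UAS

EARTH_BASELINE_M = 1.0e7  # roughly an Earth-diameter baseline, m

# EHT at 1.3 mm: fine enough for the ~40 microarcsecond shadow of M87*.
print(f"1.3 mm: {resolution_microarcsec(1.3e-3, EARTH_BASELINE_M):.0f} uas")
# A 10x longer wavelength on the same baseline is 10x too coarse.
print(f"1.3 cm: {resolution_microarcsec(1.3e-2, EARTH_BASELINE_M):.0f} uas")
```

This is why a mission observing at centimeter wavelengths cannot resolve the shadow without dramatically longer baselines.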
But I think yeah, absolutely.
We want to piggyback with these other efforts
and get something off the ground quickly.
Joe Paradiso: What antenna size would you
actually want for a space mission?
These are traditionally big antennas that
you're using.
So, could that scale to a smaller one and
still be effective?
At millimeter wavelengths, technically, it's not
so much of an issue.
But to collect enough radiation [inaudible
01:05:06].
Vincent Fish: Yeah.
The bigger the better.
But big also makes it very difficult to create
in the first place, and very expensive on a per-element
basis.
Very small, on the other hand, and you don't
have the sensitivity.
And if you were to consider phased arrays
or something like that, then you would need
a receiver per element of the array, and so
that becomes cumbersome and expensive as well.
Vincent Fish: I think if you're willing to
leverage the big apertures on the ground,
not just ALMA but the LMT, the phased SMA,
the phased NOEMA in France,
you'd be happy if you could get a satellite
up there that's three meters, four meters [crosstalk
01:05:47], somewhere in that range.
Michael Johnson: They can be really small
and still be tremendously effective.
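Why a small space dish can still work: on a single baseline, sensitivity goes with the geometric mean of the two antennas' collecting areas, so pairing a small orbiter with a large phased ground aperture recovers much of the loss. A rough sketch with illustrative numbers (the ~73 m equivalent diameter for phased ALMA is an assumption, not an EHT specification):

```python
import math

# On one interferometer baseline, sensitivity scales with the geometric
# mean of the two collecting areas, i.e. proportional to d1 * d2.
def relative_baseline_sensitivity(d1_m: float, d2_m: float) -> float:
    a1 = math.pi * (d1_m / 2) ** 2  # collecting area of dish 1, m^2
    a2 = math.pi * (d2_m / 2) ** 2  # collecting area of dish 2, m^2
    return math.sqrt(a1 * a2)

# Two 4 m space dishes vs. one 4 m space dish paired with a large
# phased ground aperture (assumed ~73 m equivalent diameter).
small_pair = relative_baseline_sensitivity(4, 4)
mixed_pair = relative_baseline_sensitivity(4, 73)
print(f"4 m + phased ALMA baseline: {mixed_pair / small_pair:.0f}x more sensitive")
```

This is the leverage Vincent describes: the ground anchors the sensitivity, the satellite supplies the long baseline.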
Joe Paradiso: Good.
So it's 2:10.
If there's no other burning question, I think
we can wrap this up.
And thank these folks for coming down.
And [crosstalk 01:06:04] collaborators on
this great achievement.
