HONG FAN: Hello, everyone.
My name is Hong Fan.
I'm a program director at
MIT corporate relations.
On behalf of the MIT
Industrial Liaison Program,
I welcome you to today's
webinar, Quantum Computing:
Opportunities and Challenges.
Since its founding in 1948,
the MIT Industrial Liaison Program
has served as a gateway
connecting worldwide leading
companies to MIT research
and the larger MIT innovation
ecosystem, including
MIT-connected startups.
If you are interested
in learning more
about the program, please visit
our website at ilp.mit.edu.
Today, we are delighted to have
Professor Will Oliver, director
of the Center for Quantum
Engineering at MIT,
to be our faculty speaker.
He will start the webinar with
an introduction to quantum
computing, as well as an
update on the latest research
activities at MIT in this field.
We will then invite speakers
from the US government, Google,
Keysight Technologies,
and Zapata Computing
to share their
perspectives on the latest
developments in quantum
technologies and applications.
In the last 45
minutes, the speakers
will join a panel discussion
and address questions
from the audience.
You are encouraged to
submit your questions
or vote on questions at any
time during the webinar,
actually starting right now.
So please use the Q&A
button on your screen
for submitting questions
or voting on questions.
Please only use chat to report
technical issues to our webinar
team.
Today's webinar will
be recorded and posted
on our website at ilp.mit.edu.
All right, let's just get
into this very exciting world
of quantum computing.
Will, over to you.
WILL OLIVER: Great.
Thank you very much, Hong.
And welcome, everybody,
to this webinar.
I'm really glad you could take
time out of your busy schedules
to join us today.
My talk today is
on an introduction
to quantum computation.
And before I get
started, let me just
thank Hong, Michael,
Kayla, and the ILP team
for hosting this today.
Great.
What I'd like to do is
I'd like to get started
with a video that's from our
MIT xPRO Online Professional
Development course.
It runs about four minutes.
And it gives a very
brief introduction
to quantum computation.
And then I'll come back in
right after that's done.
[VIDEO PLAYBACK]
[MUSIC PLAYING]
- We hear about
quantum computers
nearly every day in the news
and in the popular press.
- That's what's exciting
about quantum computing,
and that's what we're--
- It's said that
quantum computers will
solve certain types
of problems, ones
of tremendous
importance to humankind,
and problems that,
today, are practically
prohibitive or maybe
even impossible to solve
with current computers.
We hear about pharmaceuticals
and drug discovery, gaining
a better understanding
of new materials,
like high-temperature
superconductors
and how they work, new
methods for machine learning,
artificial intelligence,
optimization problems,
financial services
and technology.
Quantum computers will
even challenge and change
the way we securely communicate.
It certainly sounds like a
fantastic and exciting future,
which leads us to a few
fundamental questions.
What exactly is a
quantum computer,
and what's it good for?
More importantly,
when will we have one?
- Quantum computers are not
just smaller, faster versions
of classical computers.
They're fundamentally different.
- Whereas in the digital
computer world,
a bit, which is the basic
element of computation,
is a zero or a
one, nothing else--
- --in a quantum computer, you
can have a quantum bit, qubit,
that's in a superposition
of zero and one.
- We can design it.
We can actually control it.
We're actually engineering and
manipulating quantum mechanics.
- So [INAUDIBLE] here is a
very interesting and nuanced
question.
And the answer will
therefore be kind of finicky.
- We've been saying quantum
computers are 10 years away.
We've been saying
that for decades.
- Depending on your
definition, we already
have quantum computers.
They're just small.
- In my lifetime, I believe.
And I won't [INAUDIBLE]
- This is not decades
away or 100 years away.
The quantum age has now come.
- Quantum computers
aren't simply faster,
smaller versions of the
conventional computers
we have today.
Nor are they another
incremental step
in the evolution of Moore's law.
Rather, quantum
computers represent
a new, fundamentally
different computing paradigm--
one that carries tremendous
advantage for certain types
of problems of importance.
- Quantum computing could
really transform industries
where there are significant
optimization problems.
You've got a lot of
discrete or binary decisions
to make and figure out, do you
do this first or that first?
This is where we shine.
- Another way to understand the
difference between classical
and quantum [INAUDIBLE] quantum
systems and quantum simulation.
- The quantum processor is
a suitable tool for modeling
other quantum systems.
- [INAUDIBLE]
molecular systems--
tons of systems that we
use-- material systems
work based on those quantum
mechanical properties.
- You kind of need a
quantum machine in order
to simulate quantum effects.
- When we can manipulate
individual molecules
and understand what's going on
in those molecules, how they
bond, then we'll be able to
have a really good handle
on generating new things,
novel materials that
might be very useful.
- Still, we're just at the very
beginning of quantum computing
development,
assembling and testing
the first prototype processors.
It's a bit like
being in the 1950s,
at the dawn of
transistor-based computing.
And just as integrated
circuits led to an information
processing revolution
last century,
driving economic growth
and productivity,
many people today believe
that quantum computing
will have a similar
impact in this century.
- Quantum computing
and quantum algorithms
present fundamentally new
programming and algorithm
design paradigms.
How do we fundamentally
unlock new ideas in computing?
We're still learning
a lot about how
to improve the
individual components,
as well as connect
them together.
- That's part of the fun.
That's where [INAUDIBLE]
to see, what can we
do to enable that increased
complexity and functionality
of these qubits?
- We're really here
at the very beginning.
And it's-- I just find
that tremendously exciting.
[END PLAYBACK]
WILL OLIVER: So it is
a very exciting time
for quantum information
science and technology.
And although today's
talk is primarily
about quantum computing,
let me just take a step back
for a moment and say that
quantum technologies comprise
the sensing of
quantum information,
its distribution over quantum
networks, and then of course
the processing of that
information with quantum
computers.
And if we had to
define it succinctly,
I would say that quantum
information science utilizes
a quantum mechanical
description of nature to sense,
communicate, and
compute information
in ways that are unobtainable
by a classical description
of nature itself.
And so, again, the talk
today will be primarily
about quantum computation.
But let me just highlight a
few areas in quantum sensing
and quantum communication
that the Center for Quantum
Engineering is also covering.
And this includes
precision positioning,
navigation, and timing;
remote sensing and detecting;
biomedical applications--
for example, magnetic
field sensing in the brain;
next-generation GPS with
centimeter-scale accuracy;
and the distribution of
quantum entanglement and secure
communication.
And I just mention here
that MIT is very fortunate
to have a quantum communication
test bed with fiber that
runs from MIT Lincoln Laboratory
42 kilometers down to the MIT
campus, where we can test these
different quantum communication
protocols.
And so with that, we'll
get into quantum computing.
And before we get
started, I'd like
to start with a poll question,
which you should see here
in front of you.
And the question,
basically, is when
do you think quantum
computers will be commercially
available, useful, viable?
And that can mean different
things to different people.
But what do you think?
Will it be in three years?
Will it be within five years,
10 years, 20 years or more?
Who knows?
I'm unsure.
When do you think quantum
computers are going
to be commercially useful?
It's a tricky question, right?
Because, as Donna
Rosenberg said in the video
that we just saw,
we actually
have quantum processors today.
And as we'll see, and
as we'll talk about
in the panel discussion,
there are small-scale quantum
processors available to us now.
But the question
is, when will they
become commercially viable?
And of course, they
generate revenue already.
But when will they
generate profit?
OK, and so it looks like
the poll results are in.
And we'll share these
with everybody at the end.
So I'm asked this
question quite often.
And the way that I typically
will address it is by--
you know, I can't see the future
any more than anyone else.
And so what I do is
I take a look back
at the history of classical
electronic computing.
And here's a timeline
that I'm showing here.
The vacuum tube was
invented in 1906.
And it was used for
radio transceivers
for a number of years.
But it was a full
40 years before we
had our first vacuum-tube-based
computer called the ENIAC.
And that was in 1946 at the
University of Pennsylvania.
Around this time, the
transistor was invented in 1947
at Bell Labs.
And although within
about 10 years,
MIT Lincoln Laboratory
had built the first fully
transistor-based
computer, called TX-0,
based on these transistors,
it's very different than what
we know of today.
I mean, these transistors
were soldered together,
used a magnetic core
memory, et cetera.
Now, around that time, the
first integrated circuits
were developed at
Texas Instruments.
And it was yet another
15-ish, 20 years or so
before we had the first
chips that we would recognize
today, built by Intel--
for example, the 4004,
with 2,000 transistors,
in the early '70s.
It would take another 25 years
to get to the Pentium Pro,
with millions of transistors,
and then another 20 years
to get to where we are today,
with multicore processors
and GPUs with billions and
billions of transistors.
So if we look back
at this timeline,
it's well over 100
years of development
to get from the first
elemental logic elements
to where we are today.
Now, we can compare
that or contrast it
with quantum computing, which
is still at its very beginning.
It's nascent.
In the early '80s,
Richard Feynman
proposed that if you want to
simulate a quantum system,
you had better use a
quantum system to do it,
because it's a hard problem.
There are many, many
degrees of freedom.
And theorists thought about
this for, again, a good 10 or 15
years before we started to see
the first quantum algorithms,
theoretically speaking-- the
first quantum algorithms,
like Shor's factorization
algorithm, Grover's algorithm
for search, quantum annealing
and adiabatic quantum
computing.
And from that point forward,
it was a good 10 or 15 years
to get to where we
are today, where
we have the first few-qubit
processors in the cloud,
with 5, 16, 20 qubits.
And then, just last
year, the Google team
demonstrated quantum advantage
with a 53-qubit quantum
processor.
So what's the take-away message?
Well, the first thing
that I take away from this
is that quantum
computing is real.
And it's transitioning from
scientific laboratory curiosity
to technical reality.
That's happening right now.
I also take away
what we all know:
that advancing from a
fundamental discovery
in a laboratory to a
useful machine takes time,
and it takes engineering.
And of course, you must
be in the game to play.
It may not be surprising that
many of the leading technology
companies today who are
manufacturing hardware
were in this from
the very beginning,
or they bought
companies that were
in it from the very beginning.
And of course, we use
these quantum computers
to run algorithms.
And I list here many
of the algorithms
that we know of today, or
at least the categories.
I'm not going to read
this whole chart.
But let me just illustrate
or articulate the columns.
We have the classical
time it would take on
a classical computer;
the quantum time it
would take on a quantum
computer at scale;
and then the speedup
and the limitations related
to research today.
Now, if you look at
these algorithms,
the one thing that stands out
to me is that many of them
were either developed or
advanced by MIT faculty.
For example, of course,
Shor's factoring algorithm,
which is used for cryptanalysis,
and in particular against
the RSA cryptosystem, was
developed by Peter Shor.
Quantum simulation,
which many believe
is going to be the killer
app for quantum computers,
whether to develop new chemistry
or new materials, again,
has a big influence
from researchers at MIT,
Troy Van Voorhis and Seth Lloyd.
Linear systems of
equations, of course,
are very important,
sampling solutions
to very, very large matrices--
Aram Harrow and Seth Lloyd--
optimization problems, and then,
of course, search problems.
So all of these are
very important problems.
I mean, if you think
about optimization,
optimization is ubiquitous.
And everybody's trying
to optimize something,
whether it's a
financial portfolio
or a technical
challenge, like trying
to orient the
satellites in space
with multiple users and
multiple vantage points.
Optimization is ubiquitous.
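To make the scale of these speedups concrete, here is a small Python sketch comparing query counts for unstructured search: a classical scan versus Grover's quadratic speedup. The problem sizes are illustrative assumptions, and the Grover count is the idealized, error-free oracle-call estimate, not a benchmark of any real machine.

```python
import math

# Illustrative query counts for unstructured search over N items.
# A classical scan needs about N/2 lookups on average; Grover's
# algorithm needs about (pi/4) * sqrt(N) oracle calls in the
# idealized, error-free setting.
def classical_queries(n):
    return n // 2

def grover_queries(n):
    return math.ceil((math.pi / 4) * math.sqrt(n))

for n in (10**6, 10**9, 10**12):
    print(f"N={n:>15,}  classical~{classical_queries(n):,}  grover~{grover_queries(n):,}")
```

Even at a billion items, the quadratic speedup turns roughly half a billion lookups into a few tens of thousands of oracle calls.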
So there's a lot of promise
to quantum computing.
That's fantastic.
So what's the problem?
Why don't we have one today?
And it comes down to
two characteristic times
that I'd like to introduce.
One is the coherence time.
And the other is the gate time.
So the coherence
time-- you can think
of this as the quantum bit,
the fundamental logic element's
quantum mechanical lifetime--
how long its quantum-ness
remains viable.
And if we have a
qubit, we can put it
in a quantum state, which we
represent with this capital psi
letter.
And at time t equals 0, we
know exactly what it is.
But this qubit is
always interacting
with its environment.
And over time, this state
decays and eventually blurs
and is completely lost.
Now, the qubit didn't disappear.
The qubit is still there.
But the problem is
the state that it's
in after some time
period is unknown to me
as the algorithm designer.
And this lifetime due to
environmental disruption
is called the coherence time.
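A common simple model of this loss of coherence is exponential decay. The sketch below, in Python, uses an assumed 100-microsecond coherence time purely for illustration; it is not a number from the talk.

```python
import math

T_COH = 100e-6  # assumed coherence time, 100 microseconds (illustrative)

def survival_probability(t):
    # Simple exponential-decay model: the chance the qubit's state
    # is still coherent after time t.
    return math.exp(-t / T_COH)

# After one coherence time, only about 37% of the quantum-ness survives.
print(round(survival_probability(T_COH), 2))
```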
Now, the second time scale we
care about is the gate time.
And this is the time required
to perform a logic operation.
So quantum computers, just
like classical computers,
use logic gates to
implement quantum logic.
These are single-qubit
gates and two-qubit gates.
And with single- and two-qubit
gates, a handful of them,
one can implement
any quantum logic,
just like in a
classical computer,
if you have one-bit and
two-bit gates, a few of them,
you can implement
any Boolean logic.
So the time to implement
one of these gates
on a quantum computer
is called the gate time.
And as you might
imagine, then, there's
a figure of merit,
which is basically,
how many gates can I perform
within the coherence time
that I have?
And this figure of
merit is important
because it emphasizes something
that is a bit nuanced
but certainly true,
which is that it's not
enough for a qubit to have
a very long coherence time.
What's really important
is, how many gates
can I perform within the
coherence time that I have?
And so with this,
what we can do is we
can look at many of the qubit
technologies or modalities that
are being investigated today.
And we can plot this
in the following way.
We have this figure
of merit on this axis,
which is the number
of gates or operations
I can perform within
the coherence time
before there's an error.
And on this axis is the gate
speed, how fast the gates are.
And just like with classical
computers, faster is better.
Now, on this axis
here is gate fidelity.
And the gate fidelity
maps one-to-one
to the number of
operations before an error.
So if I have 100
operations before an error,
that's 99% fidelity.
1,000 operations before an
error, that's 99.9% fidelity,
et cetera.
That's how that's defined.
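Putting the two time scales together with this fidelity definition, the figure of merit can be sketched in a few lines of Python. The 100-microsecond coherence time and 100-nanosecond gate time below are illustrative assumptions, not measured values.

```python
def ops_in_coherence(t_coh, t_gate):
    # Figure of merit: how many gates fit inside the coherence time.
    return t_coh / t_gate

def fidelity_from_ops(n_ops):
    # One error per n_ops operations: 100 ops -> 99%, 1,000 ops -> 99.9%.
    return 1.0 - 1.0 / n_ops

n = ops_in_coherence(100e-6, 100e-9)  # assumed: 100 us coherence, 100 ns gates
print(round(n), round(fidelity_from_ops(n), 4))  # about 1,000 gates, ~99.9%
```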
And we have a red line here.
And I just mentioned this, that
if you've heard of something
called quantum error
correction, which
is a way to add redundancy to
improve the overall performance
of the system, that's great.
But when you do that,
the individual elements
need to be at least so
good beyond some threshold,
such that when you add
more and more together,
things actually
get better rather
than things getting worse.
And this threshold is
roughly at the 99% level
for the most lenient error
correcting codes that we know.
And of course, best
performance, then,
is in the upper-right corner.
So with that setup, we can now
look at different technologies.
There are a number
of qubit technologies that
have demonstrated
single-qubit gates.
And I just list them here--
for example, phosphorus doping in
silicon or the silicon MOS dot.
And these technologies are
just recently demonstrating
two-qubit gates.
The remainder here, though, have
demonstrated the universal set
of gates needed
for quantum logic.
And these are single-
and two-qubit gates.
And you can see a number
of technologies, which I'll
highlight in the coming slides.
So one of the two most
advanced technologies
today is trapped ions.
And here are some researchers
investigating trapped ions:
Ike Chuang and Rajeev
Ram on the MIT campus, as well
as John Chiaverini and Jeremy
Sage at MIT Lincoln Lab.
And the other is
superconducting qubits,
which is my own research area,
along with Kevin O'Brien, Terry
Orlando, and Jamie
Kerman at Lincoln Lab.
And if I had to say very
succinctly what we are
researching: trapped
ions are trying
to make their gates faster.
And they're also trying to
improve the two-qubit fidelity.
And superconducting qubits--
they're already fast enough,
and they're basically trying
to make higher-fidelity gates.
Now, speed still matters, right?
And you can think
of it this way,
is that if a
quantum algorithm is
exponentially faster than
a classical algorithm,
that's fantastic.
It no longer takes me
10,000 years to run a task.
It takes me much less time.
But still, there's a difference
between slow and fast
in the following sense:
if my superconducting
qubit quantum computer
runs at maybe
100 megahertz,
1,000 times faster than a
trapped ion quantum computer,
all else being equal--
that's a very important caveat--
then a day on the superconducting
quantum computer
would be 1,000 days on the
trapped ion quantum computer.
And so the gate
speed still matters.
And that's why trapped
ions are trying
to increase the speed with which
they can perform their logic.
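The day-versus-1,000-days argument is simple ratio arithmetic, but it is worth making explicit. Here is a minimal sketch, assuming (as the talk carefully caveats) that everything else about the two machines is equal.

```python
def slow_runtime_days(fast_runtime_days, gate_speed_ratio):
    # If gates are gate_speed_ratio times slower and nothing else
    # differs, total runtime scales by the same ratio.
    return fast_runtime_days * gate_speed_ratio

# One day at ~100 MHz gate rates vs. a machine 1,000x slower.
print(slow_runtime_days(1, 1000))  # 1000 days
```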
We're also investigating
other technologies
at MIT and Lincoln Laboratory.
So one is neutral atoms.
And here are
[? Vlad ?] [INAUDIBLE],
Wolfgang Ketterle, and Martin
Zwierlein working on this area.
Also NV centers by Dirk
Englund, and Paola Cappellaro
and Danielle Braje
at the Lincoln Lab.
And my own group is starting
silicon quantum dot research
as well.
So there are many candidate
technologies under development
to realize the promise
of quantum computing.
And although I've cast these
in the context of quantum
computing, many of
these technologies
are also used as quantum
sensors or as a means
to facilitate
quantum communication.
So there's a lot of promise
for quantum computing.
And as a result, there's
tremendous investment going on
worldwide.
And we'll talk more about this
certainly in the panel session.
But what you can see here
is a number of efforts--
I won't read through them all--
being funded worldwide
in quantum computing
by nations, by states.
But what's really changed
in the last several years
is that companies have realized
the value and potential value
of quantum computing
to their bottom line.
And in fact, many
entrepreneurial startup
companies are realizing that
there are services that they
can generate and sell.
And so it's in this context
that we formed last year
the Center for
Quantum Engineering
as an initiative between
MIT Lincoln Laboratory
and the Research Laboratory
of Electronics at MIT.
That was the
instantiation of it.
But it is institute-wide.
And so let me just talk
briefly about the center.
You can find the center
website shown here at this URL.
Our mission statement is the
academic pursuit and practice
of quantum engineering
to accelerate
the practical applications
of quantum technologies.
And I want to emphasize that
"quantum" means science,
and "engineering," of
course, means engineering.
And so quantum engineering is
both science and engineering.
The science is not
over, but we need
to start engineering
systems of qubits.
The objectives
here are manifold.
But let me just
mention four of them.
The first is to define
quantum engineering.
What is it?
What are the textbooks?
We have to write them.
What does the
curriculum look like?
We have to develop it.
And we use it to educate
tomorrow's quantum engineers
and create a quantum workforce
for this growing field.
And to do that,
we want to partner
with industry via a
consortium model, which
I'll talk about in a moment.
And our general goals are
to advance quantum science
and engineering.
And so we have a
large membership.
This is a partial list.
I couldn't fit
everybody on one slide.
But as I mentioned, it
comprises membership
from the MIT campus,
many departments
across the institute.
You'll see physics, electrical
engineering, material science,
chemistry, chemical engineering,
mathematics, et cetera,
as well as membership
from Lincoln Laboratory.
And we have a number
of engagements
that we've initiated
over the past year.
It's continuing to grow.
One of the major ones
was where that video
came from at the beginning.
That's the MIT xPRO Professional
Development series of courses.
It was underwritten
and sponsored by IBM.
And there are two classes--
four courses, actually.
The Fundamentals of Quantum
Computing has two classes.
And then the Practical
Realities of Quantum Computing
is another two classes.
And you can learn more about
the professional development
courses here at this URL.
Ike Chuang and I
were the faculty leads.
And we teach it
with Peter Shor and Aram Harrow.
We've had more
than 2,000 learners
take it in the last 18 months.
And we're now offering it
multiple times per year.
And the focus is on people
who are already in industry
and want to pivot towards
quantum technologies.
The second engagement
is with the Laboratory
for Physical Sciences.
And this enabled us
to form, within the Center
for Quantum Engineering, the
CQE-LPS Doc Bedard Fellowship
Program.
Doc Bedard was an NSA LPS
scientist for many, many years
who really spearheaded the use
of cryogenic electronics
for classical computation.
And through this
program, we have
now eight three-year
fellowships,
graduate fellowships, a few
sponsored research programs
that we administer
through the CQE,
and funds to develop
quantum curricula.
We also facilitate faculty
industry engagement.
I just give one
example here, which
I think Liz, one
of the panelists,
will talk about as well.
And that is, through a
very generous donation
from Keysight Technologies, we
are developing a 64-qubit test
bed system at MIT.
And Keysight Technologies
in particular is interested
in seeing their electronics
being used in state-of-the-art
experiments.
And of course, another
industrial engagement
is, in fact, the quantum science
and engineering consortium
industrial group.
So that group, of course,
is a membership program.
And as a member, the entry
fee is 150K per year.
And startups-- we
want to include them
as well in this ecosystem.
So we charge them
only 10K per year,
because money is quite
valuable to a startup.
But time is also
valuable to them.
And so we charge them time.
They have to show up.
I won't get into
all the details.
There are options-- if you're an
existing MIG member or MIT.nano
member, there are options of
how to join at a reduced cost.
But what's important is
the value proposition
for doing this.
One is access to faculty
and research at MIT.
Another is engagement with the
fantastic students and postdocs
that we have, which
then leads downstream
to recruitment opportunities.
We host a number
of special events,
workshops throughout
the year, as well as
the ability to network
with other members
in the consortium.
We provide discounts to these
professional development
programs.
And perhaps most popular--
I guess we'll see-- is the
ability to customize membership
fee dues.
So of the 150K that you
might pay in dues,
100K you can customize.
You can send that to a
faculty member, a laboratory.
You get to choose
how that 100K is
spent within the auspices
of quantum science
and engineering.
Another 25K, we hold and
use for faculty startups
or infrastructure.
And then 25K is used for the
operations costs of the center.
And what that means is
that 83% of your fees
are going directly to supporting
research activities at MIT.
And I'll highlight that these
funds are, in fact, a gift,
which makes them discretionary.
And at MIT, that means
they don't incur overhead.
And so these funds go
even further than, say,
a sponsored research agreement.
So that leads me to the
second poll question, which
I would like your input on.
And that's basically, with
the Center for Quantum Science
and Engineering Consortium, the
benefits or value proposition
that I just spoke
about, which of them
would you or your company find
most valuable or attractive?
And in this one, you can
select all that apply.
And again, the
options are access
to faculty and research--
this means that you can
call up a faculty member.
If you've customized
your dues to this person,
they're very likely
to pick up the phone
and have a conversation, right?
Talk about your
interests in quantum.
You also get early access to
the research that's going on.
Of course, there's also
engagements with students.
And that leads to
recruiting downstream.
MIT students, as you know,
are just simply fantastic.
Special events and workshops,
the ability to network--
in addition to
companies and startups
that will be attending these
workshops and annual events,
we also invite members
of the US government
who are sponsoring
research in this area.
Discounts on professional
development programs,
as I've mentioned.
And also, if you're
in multiple consortia,
there are opportunities for us
to provide discounts there.
And then, again,
the customization
of the membership
fee, that 83% is
going to go directly to
research here at MIT.
So I'd like to hear from
your perspectives which
of these are the most valuable
or the most attractive.
OK, and that has come in.
And it looks like access
to faculty and research
as well as the special
events and students are high.
Good.
Thank you for that.
All right.
OK.
So now, moving on.
So I want to emphasize
that MIT, the institute,
is not just MIT campus.
I've mentioned Lincoln
Lab a couple times.
And I want to emphasize
here that Lincoln Lab is
MIT's national laboratory.
It's a Department of
Defense national lab.
It's a fantastic place.
It's where I started my career.
And Lincoln Lab has
well over 100 people
who are working on
quantum information
science and technology,
from sensors,
to quantum communication, to
quantum computing as well.
And there are many ways
in which we interact.
And part of the
center's mission is
to make this interaction
easier and facilitate it.
And I just bring up
one example here.
In terms of fabrication,
Lincoln Laboratory has
a fantastic fab facility.
It's a DOD trusted foundry,
ISO 9001 certified.
And this is a place
where we can do
high-yield,
reproducible fabrication
processes at a larger
scale, building test beds.
This fabrication line is a
200-millimeter fabrication
line.
And as you know,
just recently, MIT
invested more than
400 million dollars
in what's called MIT.nano,
which is a companion
facility at the MIT campus.
And this is really fantastic
for quantum information,
because this facility is
more than two times larger
than any other US
academic facility.
And this is where
we can do novel
rapid prototyping, exploratory
research on smaller wafers.
And we will have, in fact,
200-millimeter wafer tools.
We're installing some this year.
And so the interaction
between MIT.nano
and Lincoln Laboratory's
Microelectronics Laboratory is going
to be very strong
and complementary.
And this is just
one example of where
we can impact materials
research, fabrication
engineering, and the like.
So with that, let
me conclude and just
say what I think
quantum engineering is.
And really, it's an
institute-wide effort.
If we want to build
future quantum systems,
we first need to
build test beds.
And to do that,
we need to bridge
the science and mathematics side
of MIT, the School of Science--
physics, math,
computer science--
with the classical engineering--
the School of Engineering,
such as analog and digital
circuits, control, DSP,
materials, and fab.
And both of these sides
need to pivot to quantum.
And so I view
quantum engineering
as the bridge, which is
connecting science, math,
and classical engineering.
Let me just briefly
mention that we've
selected a number
of panelists today
to partake in our discussion.
And I really appreciate
their attending today,
both from government and
from industry, so Dr. Charlie
Tahan, Dr. Erik Lucero, Ms.
Liz Ruetsch, Dr. Christopher
Savoie.
And I'll introduce them
one by one as they come on.
But I want to mention
that Charlie's
with the US government, OSTP
and the National Quantum
Coordination Office.
Erik is with Google
and represents a maker
of quantum computing systems.
Liz is with Keysight
Technologies,
making the electronics which
will go around quantum sensors
or quantum computers.
And Christopher is with
Zapata, which is building the--
developing the algorithms
that we're going to use
and is representing
the startup community.
So with that, let me summarize.
Quantum sensing,
quantum communication,
and quantum computing are
transitioning to reality.
That's happening today.
And as a result, we've
instantiated the Center
for Quantum Engineering.
It's enabling this transition--
research, education,
workforce development,
outreach, and more, but also
engagement with industry.
And that led us to the
Industrial Consortium,
the QSEC.
And MIT xPRO was
our first online set
of courses for
professional development.
But we're now expanding
both to curricula at MIT,
and we also recently
sponsored The Coding School,
which had more than 300 high
schoolers for a week-long
course--
in fact, taught by
two graduate students.
And for more
information at any time,
please feel free to contact
me at this email right here.
And so with that, I'll
stop sharing my screen.
