ROSE: I'd like to thank Hartmut. I wasn't
on the speaker list until last night at about
10:00 P.M. So I've just put together a very
short brief introduction to what we're doing
and the--some of the philosophy behind the
approach that we've taken. I'd also like to
thank him for that introduction this morning.
I think it was, in some ways, very moving.
You know, especially if you're coming from
the condensed matter theory side, you see
quantum mechanics everywhere, and yet in the
world at large, and especially in biology,
people ignore it, and I've never understood
why. I mean, even this segregation between
the seemingly trivial and the things that
are higher-level expressions of quantum
mechanics, I don't understand that division
myself. Because if you ask the question "What
about this thing could be the same if it was
classical?" the answer is nothing, because
electrons, and matter as we know it, wouldn't
exist if the world weren't quantum mechanical.
There's some essence at the very bottom of all of
this that requires quantum mechanics,
and I have never understood why people draw
dividing lines between behavior at the electronic
scale and behavior at the macro scale. Because
sure, there may be dividing lines, and there
may be regimes where a system may be modeled
classically, but you can't pretend that the
thing is classical. It's not. Nothing is classical.
Under the hood, everything is quantum mechanical.
And so, you have to be open to the possibility
that the things that you care about as a scientist
or as a builder of things can be made better
or more efficient, and that you have to
understand how these things work in
order to make progress. So this theme is explicit
in the exercise of trying to build quantum
computers, because in some ways, quantum computers
are this idea made manifest. But you want
to somehow amplify the behavior of the world
from the electronic scale or the scale of
the very small and very cold up to a level
where large, hot, macroscopic things such
as humans can play with these things and make
them do things that we find interesting like
solve hard problems. So I'm going to talk
about one such effort which is the one that
we've been involved in for some time now.
So D-Wave is a private company. It's
been around for about 11 years, and we only
do one thing, and that is build superconducting
processors. So these are processors that are
fabricated out of metals, in particular a
metal called niobium, and run at very cold temperatures.
Designed to embody a very specific quantum
algorithm, which is this quantum annealing
thing that Hartmut referred to earlier on.
So these things are not universal quantum
computers, for those of you who are familiar
with the field. They're not gate model quantum
computers which is a particular type of quantum
computer. They're another way of looking at
how you would use quantum mechanics in order
to compute something useful in the world.
It's a fairly small effort as these things
go, but we do have some unique infrastructure
including what's commonly believed to be the
most advanced superconducting chip fabrication
effort in the world, which is run by Eric, who
is sitting back there. So if you want to know
anything about superconducting chip fab, he's
arguably the world's leading expert in all of
this. And that is something like the
crown jewel in our effort. Our ability to
fabricate these things efficiently and quickly
has differentiated this effort from a lot
of others. So I'm just going to very briefly
talk about how we got to where we are over
the last 11 years, and there are really two
phases to this effort. And I think that this
model of how we did this applies to all sorts
of basic science commercialization exercises,
so I think it's relevant to this group. So
if you have something that is inherently involving
a huge basic science concept or concepts and
you want to bring it out into the world, this
is a path to doing that that has been very
effective for us and I think there are some
lessons to be learned for others. So the efforts
are differentiated in two phases. The first
phase was one where there were some conceptual
notions about how you proceed to build a quantum
computer back in 1999. And there were the
first vestiges of experimental data that suggested
that you could actually build components of
these things in the lab that had something
like the properties that you wanted. But back
then, this was like--imagine the task was
to map the entire U.S. The area that had been
mapped was this tiny little hundred square
meter area right next to where the ship landed.
So there were some things known about a very
tiny amount of the types of things that you
need to build one of these systems. So being
young and foolish, me and a bunch of other
people in the early days felt that it was
a worthwhile endeavor to try to map more of
this territory. So we wanted to understand
what was known and what was doable, not just
in principle but in practice--it's a very
important point I'm going to come back to--about
all of the different areas that you need in
order to really build a computer. So quantum
computers are computers, and the way that
I look at them is they're more computer than
quantum, and that's not the way that most
people in quantum computing view this field.
The quantum for them is a capital Q, and the
computer is a little C. For me, it's the other
way around. The most important part about
building a computer is building a computer,
and this is--this is not just semantics. It
flavors the entire philosophy behind this
project. So we entered into a bunch of research
agreements with universities and research
institutes around the world with the intent
of trying to answer some of these questions
about what was doable in principle or in practice.
So that went on for a while until it became
clear that the original vague conception of
how this would all turn out, which was that
someone would try to build a quantum computer
for real and that we would somehow be involved
or help that effort along, was just not
going to happen. No one had really seriously tried
to build one of these machines, and this is still
the case, just being realistic about
the amount of effort that's required in order
to really build a computing system. I mean,
imagine even a conventional silicon processor
like the Cell chip. It required billions of
dollars of investment and in some ways, that's
just an incremental step over what was already
known. It was a so-called engineering project.
There's nothing like that in quantum computing.
Even the effort that we put into this, which
is in the sort of $100 million range, is nowhere
near the effort that even a single new
semiconductor chip gets. So the scale of effort
that's required to do this is not being invested
by anyone, for good reasons. But anyway, so
what we wanted to do was to try to implement
an idea because we didn't see any value coming
from this if somebody didn't do it. So we
figured, why wouldn't we? Again, we were young
and foolish back then. So what we decided
to do was use the knowledge that we had gained
to understand whether it was actually
possible, given the limited resources that
we knew we'd have, to build a quantum
computer. And the idea was to focus on architectures
that were extremely tolerant to the types
of faults that you'd get in the real world.
So when you build a semiconductor chip or
a superconducting integrated circuit, you
have to deal with the real world consequence
that not all of your devices are going to
be the same; that there are going to be variations
across a wafer or across a chip; no two
qubits are going to be identical. You're
going to have to deal with the fact that
there are going to be defects in all of this.
So how is your system going to work in the
presence of all that realistic stuff? So the
thing that we've ended up building is not
the be-all and end-all of quantum computing.
In fact, I think it's, in some ways, just
a very early, faltering first step. But this
effort, and things like it, are required to get
to the future that everybody in this field
dreams will someday happen, which is that
we have control over nature at such a fine
level that we can start exploiting these things.
We can take a microscope and look at things,
and we know that quantum mechanics is a very
good model of nature at these levels. But how do you actually
use that? And so this is a very first early
step on the way and, in some ways, parallels
some of the stuff that we heard this morning
about photosynthesis: it's just beginning
to dawn on the collective consciousness
that there are things that you can do here
for real, practically, within timescales that
we care about, and it's a very exciting time
to be involved in the field. All right, so
what I'm going to do is show you what one
of these things looks like. So the implementation
that we use requires cryogenic temperatures,
and that's millikelvin. So we use machines
that are called pulse tube dilution refrigerators.
So these are machines that allow you to cool
a very substantial block of stuff down to
close to absolute zero, in our case, somewhere
around 10 millikelvin. The inside of this
metal cylinder contains an insert which looks
like this, where these circular plates sit
at colder and colder temperatures, from room
temperature at the top down to near the base
temperature of the fridge at the bottom.
And at the bottom of this,
you see an assembly which is a bunch of filters
and shielding elements. And the chip itself
resides right down at the bottom. Now for
us, there are two very difficult-to-meet environmental
constraints. One is the temperature.
The other is that the processor type that
we build is very sensitive to magnetic fields
and they can't be anywhere near the chip.
People are amazed at the fact
that you can attempt to build a computer that
runs at millikelvin, but that's really not
the hard part here. It's much harder to make
these low magnetic fields than it is to make
the low temperatures, for a variety of reasons.
So we've done a lot of work on magnetic shielding.
That thing at the bottom there, if you look
a little closer, has something that's very
much like a motherboard upon which is mounted
the actual chip itself. Remember, these are
superconducting chips, so they are very much
like your standard CMOS-based device, except
that now we're talking about metals instead
of semiconductors, and it sits right in the
middle there. So this thing here is one
of the chips. This is an optical photo of
a portion of one of the actual processors,
and I'm going to explain a little bit what
all of this is. But just to give you some
flavor, these cross hatchings are actually
loops of metal. So these things that look
like white lines are actually topologically
loops. And these lines are the so-called qubits
in this type of design. So every time
you see one of these horizontal or vertical
white lines, it's actually a loop of metal,
and these are the qubits in this design.
They're so-called superconducting
flux qubits. The way that they encode information
is in the direction of a current which flows
around this loop. So if you imagine counterclockwise
rotation of this current around this loop
corresponding to binary state zero and the
other direction corresponding to binary state
one, these things at low temperatures are
binary objects. So they can only have one
of those two states and those are the states
that encode the information that we care about.
So, at a higher level, the way that the system
works is that all of this low temperature
physics gadgetry is connected to the world
and you can access it remotely. So all of
the stuff that the experimental guys back
at the lab can do to the system, you can do
from here. If you could resolve the projector,
I could show you. So we're actually sitting
on one of the leaves now. And the way that
thing is architected, it can
handle multiple users at the same time doing
simultaneous experiments. So about the processor,
I'm going to--I'm going to be a little bit
philosophical here. So the dominant paradigm
in quantum computing is a model that arose
from the computer science and mathematics
communities largely, and it is a beautiful
model which is universal for quantum computation,
but it has one significant flaw. It makes
assumptions about the world that are not what
you see in nature. So, the natural world has
a certain way about it. I have a condensed
matter physics background, and I think a lot
of my view of this is flavored by that. So
condensed matter physicists study real systems
in the lab, bulk magnets and things of this
sort where it's simply not natural to think
about the thing that you care about in isolation
from everything else, because almost nothing
in the world is like that. All systems in
nature have a thing that you care about and
they have an environment, and the environment
significantly impacts the behavior of the
central system. So the gate model conceptually
relies as a first pass on the absence of that
environmental coupling. Now there are methods
that have been developed that work in principle
that allow you to remove the effects of environments.
But I've always been of the opinion that they're
not feasible in practice, so I'm going to
be a little bit controversial about
this. And I don't think that this particular
model will ever be built. And that's not the
common theme that runs through the quantum
information community. This has been a model
that people focused on for a long time now.
But I think that this model ignores some basic
facts about the way the world works that you
cannot, in practice, overcome. Yes, so I've
listed some of them, and any experts
in the audience who want to debate me about
this point are welcome to. Okay, so back then, I had convinced
myself that this gate model was not a good
thing to try to do. And so we were faced with
a decision, and being ornery, we decided to
do number two. And the inspiration for what
we ended up doing came from the same group
that the first speaker referenced. It's based
on ideas arising from the field of quantum
magnetism. So, quantum magnetism is a study
of magnets usually at low temperatures where
the behavior of the system somehow exhibits
quantum effects. And this paper was very influential
to me both as a graduate student and also
in my professional life at D-Wave. And I--it's
got a lot of text on it but I included it
all because it's one of my favorite passages
from any science paper that I've ever read.
And the stuff in the blue is really the summary
nugget of what we do. So, the idea is that
you want to build a quantum magnet. So, usually
in nature, quantum magnets are given to you.
You have a crystal of something that has yttrium
and holmium and a bunch of stuff in it. It
just is what it is. It's not programmable.
You can go in and you can measure things but
it's not all that useful as a computer, because
it kind of does the one computation that mimics
itself. You know, you can't program it. So
what we wanted to do was take the exact same
paradigm but make it programmable. We wanted
to be able to set the different parameters
in this thing but otherwise make it virtually
identical to a quantum magnet. So, I don't
have time to talk about what may be the most
interesting part of this which is how quantum
effects are used. What I'm going to do instead
is I'm just going to talk about what the thing
is supposed to do and not how it does it. So
what the system does is it allows you to define
an optimization problem. So, this thing on
the left, is an optimization objective
function which is defined over binary variables.
So, these Ss are plus one, minus one variables,
or you can think of them as zero ones. This
expression means that for any particular setting
of those Ss, I get a number. The Hs and the
Js are the program for the machine. So as
a user of this machine, I specify what those
are and I set them to be real numbers between
minus one and one, all of them. So the Hs
and Js are just numbers. They are the program.
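Written out, the objective being described matches the standard Ising form used in quantum annealing (the notation below is an assumption on my part: the transcript's Hs, Js, and Ss appear as $h_i$, $J_{ij}$, and $s_i$):

```latex
E(s) \;=\; \sum_i h_i\, s_i \;+\; \sum_{\langle i,j \rangle} J_{ij}\, s_i s_j,
\qquad s_i \in \{-1,+1\}, \quad h_i,\, J_{ij} \in [-1,1]
```

The Hs and Js are the user-supplied numbers, and the machine's job is to return a setting of the Ss that makes $E$ as small as possible.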
They come from some application or some idea
about how I want to use the system. It's a
separate issue. I give them to the system
and now the system's job is to return me a
configuration of all these Ss, so settings
of switches. And if everything works just
so, that setting of switches will be the minimum
number that you can get. So, there is a setting
of these switches that minimizes the objective
function. Yes, so that's what we're looking
for. Almost. I'm going to clarify that a little
bit. Just to kind of attach this to the physics
or the physical picture of the hardware, so
recall I mentioned that each of these little
white guys is a qubit. So the way that this
architecture works, the--this thing is a unit
cell. It's a repeating block that tiles the
plane. So I'm showing only part of a circuit
here but you can imagine this would stretch
out for a long distance over a chip. Take,
say, this qubit. This qubit has physical
connections to all of the qubits it intersects,
so I can couple this variable to this one
and so on. It also has connections between
unit cells via these guys. So, these yellow
dots that I've shown are areas where you can
program this pair-wise interaction. So, you
can--I can, as a user, look at every single
intersection and every single joint there
and put a number on that which is this quadratic
thing. I can also put a number that's local
to every variable, that H thing, on each of
the guys. So my job as a programmer is to
figure out how to do that in order to make
something make sense. Now just a little bit
of a segue or a transition here. What you
actually get in nature is not the exact right
answer every time, and this is related to
the fact that these things are real, condensed
matter systems interacting with a bath at
some temperature. So what you actually get
or what you would expect to get as a realist
is a distribution over these answers. So if
you go back to Hartmut's original picture
about quantum annealing, any physical system
that is defined by this energy landscape--this
optimization function landscape has many,
many local minima and a bunch of junk in it.
What you're going to expect to get from this
is a sampling from all of those energy states.
You don't get the exact answer every time
and for some hard problems, you're not going
to get it at all. So this is an NP-hard problem.
It's likely that quantum computers can't solve
NP-hard problems efficiently. It doesn't matter
because it's not the way we're using this.
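To make the sampling picture concrete, here is a toy classical sketch, not the device's quantum dynamics: it enumerates the Boltzmann distribution of a tiny instance of the objective and draws samples from it. The h and J values are made up for illustration.

```python
# Toy classical sketch of "sampling from a programmable distribution".
# This is NOT the device's quantum dynamics; it just enumerates the
# Boltzmann distribution of a small Ising objective
#   E(s) = sum_i h_i s_i + sum_(i,j) J_ij s_i s_j,  s_i in {-1, +1},
# and draws samples from it.
import itertools
import math
import random

h = {0: -0.5, 1: 0.2, 2: 0.0}      # local fields ("the Hs"), made up
J = {(0, 1): -1.0, (1, 2): 0.6}    # pairwise couplings ("the Js"), made up

def energy(s):
    """Objective value for one setting s of the binary variables."""
    e = sum(h[i] * s[i] for i in h)
    e += sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    return e

def boltzmann(temperature):
    """Probability of every configuration at the given temperature."""
    states = list(itertools.product([-1, +1], repeat=len(h)))
    weights = [math.exp(-energy(s) / temperature) for s in states]
    z = sum(weights)               # partition function
    return {s: w / z for s, w in zip(states, weights)}

dist = boltzmann(temperature=0.1)  # cold: the ground state dominates
best = min(dist, key=energy)       # exact minimizer, by brute force
samples = random.choices(list(dist), weights=list(dist.values()), k=1000)
```

At low temperature the minimum-energy setting dominates the distribution, which is the sense in which drawing good samples and optimizing coincide; at higher temperatures the samples spread over the low-lying states.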
We're using it to generate samples from some
probability distribution. In a limit of zero
temperature and everything being great, you
get the ground state of this objective function
which is the exact answer. But you don't really
care about that happening all the time; all
you want are good samples fast. So at a high
level, this is actually a very simple thing.
It's much simpler than the Cell processor.
It's just a lot different. The chip is a generator
of probability distributions that you can
tune as a user. So those of you who are involved
in machine learning or in pricing derivatives
or in anything that requires Monte Carlo where
what you're doing is basically trying to draw
lots and lots of samples from a predefined
probability distribution, that's what this
thing does. The way that it works is you first
program it by setting these guys. Then you
have to wait because that generates some heat.
So you let the chip cool down then you do
one of these quantum annealing algorithms
that I haven't talked about at all. Then you
read the thing out, so every qubit is read
out at the end. That gives you a bit string,
these Ss, and then you repeat lots and lots
of times. And every time you repeat, you're
drawing a sample from that probability distribution
that you've defined. So, just to give you
an idea about how well this works, the thing
is amazingly accurate. So the probability
distributions that this thing generates are
indistinguishable within the errors involved
in extracting basic parameters of these circuits
like inductances and capacitances from open
quantum systems models. So we heard a little
bit about non-Markovian density matrix approaches
this morning. We have to do the same thing
here when we model these systems. This is
an open quantum mechanical system. You make
a bunch of predictions when you measure these
and what you're predicting are these output
probability distributions. And what you see
matches the experimental data exactly, up
to the biggest systems we can simulate.
So that's great. So these things are working
quite well. Just briefly, the focus
of the company is on machine learning applications
and again, if you want to know why,
you can ask me afterwards. And I had hoped to be
able to show you this, but apparently we
are having some technical difficulties
with connecting to the projector. So, if you'd
like to see an actual machine in action afterwards,
come see me and I'll plug it in for you. All
right, so that's it. Just to summarize, we
have seven of these things working now. The
generation of chips that I can show you afterwards
has 128 of these little devices. They're
quite highly integrated as superconducting
circuits go. They're among the most highly
integrated superconducting circuits that have
ever been built, and the next generation is
going to be considerably more complex. Thanks.
