Hi, thanks for tuning into Singularity
Prosperity. This video is the ninth in a
multi-part series discussing computing
and the second discussing non-classical
computing. In this video, we'll be
discussing what quantum computing is, how
it works and the impact it will have on
the field of computing. The foundation of
this paradigm shift in computing is the
quantum bit, qubit for short, as the unit
of measurement for quantum information.
While a classical binary digit, bit, can
only be either 0 or 1, a qubit can be both
0 and 1 due to superposition.
Superposition is a property of quantum
mechanics in which, while a system is not
being measured, it can exist as a combination,
more specifically, a probability distribution,
of two or more states. However, when we
measure the system, it collapses to a
single final state.
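To build some intuition, we can model a single qubit as two probability amplitudes and watch measurement collapse the superposition. This is an illustrative classical simulation only, not a real quantum SDK; the `measure` helper is made up for this sketch:

```python
import random
from math import sqrt

# Illustrative sketch only: a qubit is modeled as two amplitudes
# (alpha, beta) for the 0 and 1 states, with |alpha|^2 + |beta|^2 = 1.
def measure(alpha, beta):
    """Collapse the superposition: return 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition: before measurement the qubit is "both" 0 and 1;
# each measurement forces a single final state.
alpha = beta = 1 / sqrt(2)
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly a 50/50 split between 0 and 1
```

Each run yields only a 0 or a 1; the superposition shows up only in the statistics over many measurements.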
An example most people are familiar with
is Schrödinger's cat, both alive and dead
in the box at the same time, until we
open it up. A more concrete example is
the famous double-slit experiment, which
demonstrated the wave-particle duality of
matter and light. When firing electrons
through a sheet with two slits, we'd expect
each particle to go through one slit or
the other and hit the wall behind in line
with that slit, and this is in fact what
happens when we observe which slit each
electron passes through. However, when
we're not observing, the electrons build up
an interference pattern on the wall: the
same pattern of bright and dark bands that
a wave, say of water, would produce going
through the slits, with constructive and
destructive interference. Remarkably, the
exact same pattern emerges even when single
electrons go through one at a time. The
pattern on the wall traces out a probability
distribution, the probability that we'd find
an electron at a specific point on the
wall, with high-probability bands where the
waves reinforce and low-probability bands
where they cancel. In a sense, each electron
goes through both slits, and one slit, and
the other slit, and no slits... all at the
same time, and this superposition produces
the spread. Another
property of quantum mechanics is
entanglement, in which two or more
particles can have correlated final
states when measured. This means that if
one particle is measured to have an upward
spin, for example, and another particle is
entangled with it with a negative
correlation, then that second particle
will be measured to have a downward spin.
This is what Einstein referred to as
"spooky action at a distance": you can
create an entangled pair, move the particles
across the universe, and their measurement
outcomes would still be instantaneously
correlated. For more
information on quantum mechanics, be sure
to check out other creators on this
platform, such as Frame of Essence. Moving
on, now that we have a basic
'understanding' of quantum properties, how
does this translate to quantum computers?
To represent a qubit, multiple physical
avenues can be taken: the spin-up and
spin-down states of an electron, the spin
states of an atomic nucleus, or the
polarization states of a photon. Both bits
and qubits scale in the same way: 1 bit
corresponds to 2 potential states, 2 bits
to 4, 3 to 8, and so on. However, with bits
in classical computers, all those potential
output states can only be computed one at
a time, in serial operation. In quantum
computers, all states are effectively
computed together, true parallel operation.
As a side note, a qubit's state is
represented on a Bloch sphere: the 0 and 1
states sit at the poles along the z-axis,
while equal superpositions lie on the
equator in the x-y plane, with every other
superposition a point elsewhere on the
sphere. N qubits translate to 2^N parallel
paths of execution. To highlight how
important this is in terms of computing,
watch this clip on the power of
exponentials, which IBM in fact played in
the 1960s to highlight the growth of
computing performance:
This is an old story, but it reminds us
of the surprises we can get when even a
small number like 2 is multiplied by
itself many times. King Sharam of India was
so pleased when his Grand Vizier
presented him with the game of
chess, that he asked him to name his
own reward.
The request seemed modest, and the happy
King immediately complied. What the Grand
Vizier had asked was this, that one grain
of wheat be placed on the first square
of the chessboard, two grains on the
second square, four on the third, eight on
the fourth, 16 on the fifth square and so on.
Doubling the amount of wheat on each
succeeding square until all 64 squares
were accounted for. When the King's
steward had gotten to the 17th square,
the table was well filled; by the 26th
square, the chamber held considerable
wheat, and a nervous King ordered the
steward to speed up the count.
When 42 squares were accounted for, the
palace itself was swamped. Now, fit to be
tied, King Sharam learned from the court
mathematician that, had the process
continued, the wheat required would have
covered all of India to a depth of over 50
feet. Incidentally, laying this many
grains of wheat end-to-end also does
something rather spectacular: they would
stretch from the Earth, beyond the Sun,
past the orbits of the planets, far out
across the galaxy to the star Alpha
Centauri, four light-years away. They
would then stretch back to Earth, back to
Alpha Centauri and back to the Earth
again.
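The doubling in that story is the same 2^N growth that governs qubit state spaces: each added square, or each added qubit, doubles the count. A quick check of the numbers:

```python
# Grains on square n of the chessboard: 2^(n-1), doubling each square.
grains = [2 ** (n - 1) for n in range(1, 65)]
total = sum(grains)
print(total)  # 18446744073709551615 grains, i.e. 2**64 - 1

# The same doubling applies to qubits: n qubits span 2^n states.
for n in (1, 2, 3, 50):
    print(n, "qubits ->", 2 ** n, "states")
```

At 50 qubits the state space already exceeds a quadrillion states, which is why the 50 qubit mark comes up again later in this video.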
So, after seeing the scale of parallel
operations a quantum computer can do, how
do quantum computers compute? Step One)
Activate The Spread: the qubits required
for the calculation are acquired and
entangled. Visualized on a Bloch sphere,
these entangled qubits begin in an equal
superposition of all 2^N states. Step Two)
Encode The Problem: the problem is encoded
onto the system via quantum gates, which
we'll discuss later in this video. These
gates reorient the qubits into new
superpositions across all 2^N states by
altering their phases and amplitudes.
Step Three) Unleash The Power: the quantum
computer comes to a solution by using the
principles of interference to magnify the
amplitudes of the most probable answers
and shrink those of the improbable answers.
Some problems will require running through
these steps multiple times.
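The three steps above can be sketched as a toy amplitude-amplification loop, in the style of Grover's search algorithm. This is a classical simulation for intuition only; real hardware applies these operations as quantum gates, and the choice of marked state here is arbitrary:

```python
from math import sqrt

n = 3                 # qubits
N = 2 ** n            # 8 states in superposition
marked = 5            # the state the encoded problem "recognizes" (arbitrary pick)

# Step 1: activate the spread -- an equal superposition over all 2^n states.
amps = [1 / sqrt(N)] * N

# Steps 2 and 3, repeated: encode the problem by flipping the marked state's
# phase, then let interference magnify it by reflecting every amplitude
# about the mean.
for _ in range(2):    # about (pi/4) * sqrt(N) iterations is optimal
    amps[marked] = -amps[marked]
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]

probs = [a * a for a in amps]
print(round(probs[marked], 3))  # ~0.945: the marked answer dominates the spread
```

After just two iterations, measuring the system returns the marked state about 94% of the time, with the other seven states sharing the remainder.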
The final step draws parallels to the
double-slit experiment we discussed
earlier: through interference, a
probability distribution is produced,
showing the likelihood of the most
probable solutions, just like the spread
showing where on the wall the light would
be most likely to shine.
There are problems a classical computer
simply can't solve in any reasonable time;
this relates to the P versus NP problem.
Simply put, P problems can be solved in a
reasonable (polynomial) amount of time,
while NP problems, as far as we know, can
take impractically long to solve. One such
problem is factoring a large number into
its primes; the difficulty of this problem
is the basis of modern encryption, and
Shor's algorithm is the quantum algorithm
that solves it efficiently. A classical
computer would take on the order of
quadrillions of years to break such
encryption without a key, going through
each potential output sequentially. A
quantum computer could solve this in the
span of a few days or less due to its
parallel computation. A more in-depth discussion
on quantum encryption and security is a
topic best left for a future video on
cyber security. Also, as a side note, if
you want more information on the P
versus NP problem, be sure to check out
the best video on the topic, by the creator
hackerdashery. Back on topic, another
huge problem set that quantum computers
can solve, one that will drastically impact
the world, is optimization problems.
Classical computers can handle optimization
problems up to a certain point, before a
combinatorial explosion occurs. This is the
point where the number of different
combinations that must be explored in a
given problem grows explosively. Take
the optimal seating plan for 14 people at
a banquet dinner, for example. With 2
people there are 2 factorial (2!), in
other words 2, combinations; 3 people
gives 6 combinations, 4 gives 24, 5 gives
120, 6 gives 720, and 7 gives 5,040. As you
can see, the problem is rapidly approaching
a tipping point: moving forward by another
seven people, at 14 people there are over
87 billion different seating combinations.
This simplistic example serves well for
visualizing the scale optimization
complexity can reach, and how problems,
while simple at first, can get out of
reach for classical computers very fast.
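The factorial growth above is easy to verify with Python's standard library:

```python
from math import factorial

# Orderings of k guests around the table grow factorially.
for k in (2, 3, 4, 5, 6, 7, 14):
    print(k, factorial(k))
# 14 guests -> 87178291200 orderings, the "over 87 billion" mentioned above
```

Each added guest multiplies the count by the new table size, which is why the numbers outpace even exponential growth.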
Now, a field of computer science that has
seen a lot of traction recently and aims
to solve optimization problems is
machine learning, further extending to
artificial intelligence. These algorithms
are able to find solutions to problems
previously thought intractable under the
P versus NP problem. We'll cover this
topic intensively in this channel's AI
series, but essentially, machine learning
algorithms solve problems by crawling
through large sets of data and finding
commonalities and correlations, which
help them form their own optimal solution
rather than relying on explicitly
programmed code. Data
crawling, sorting, and path optimization
are fields of computer science in
themselves, with algorithms designed to
reduce the time required, such as bubble
sort, shear sort, Dijkstra's algorithm,
and countless others. All these algorithms
are classical in nature, and even though
some implement asynchronous techniques,
they are still fundamentally serial: a
1-million-element list, for example, is
still sorted element by element. Quantum
computing algorithms, as discussed in the
previous section, will be able to sort and
optimize data much faster through their
parallel operation; this translates to exponentially
increasing AI performance. From circuit
design, to the shape of vehicles for
optimal drag performance, to Google Maps,
to other complex problems such as protein
folding and simulating chemical
reactions, the list can go on and on.
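Circling back to sorting for a moment, the bubble sort named earlier is a good picture of serial operation: it inspects a single pair of elements at a time, no matter how large the list. A minimal sketch:

```python
def bubble_sort(items):
    """Classical, serial sorting: one comparison and (maybe) one swap per step."""
    items = list(items)  # work on a copy
    for i in range(len(items) - 1):
        for j in range(len(items) - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```

Every step touches exactly one pair, which is the element-by-element behavior the quantum, parallel approach is meant to escape.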
Quantum computing algorithms and AI will
revolutionize nearly every field from
bio and nanotechnology to marketing to
ideas we can't even imagine possible
today - many videos on this channel will
be dedicated to covering these ideas in
the future. It is highly improbable that
we will see quantum computers on a desktop
anytime soon; however, through the concept
of heterogeneous system architecture,
which we discussed in a previous video
in this computing series, there will
still be ways we can get the benefits of
quantum performance. One such method will
be quantum computers in the cloud. You'd
submit problems through your normal
devices, such as a desktop, laptop, or
mobile phone; the quantum computers in
the cloud would reduce the probability
space and return the most probable answers,
and your device would have enough computing
power to take it from there. Coming up,
we'll cover some quantum computers we'll
see in the cloud in the near future and
some that are already there now!
[Music]
2018 is for quantum computing what 1968
was for our current classical computers:
computers are the size of entire rooms,
the cutting edge of all types of
research is pouring into them and more
organizations and people are entering
the race to quantum supremacy every year.
Quantum supremacy is the point at which
quantum computers become more powerful
than classical computers for certain
tasks; this milestone is commonly set at
around 50 qubits. There is a
difference, however, between the
methodologies of quantum computing used to
get there: not all quantum computers are
made equal. D-Wave, for example, uses a
type of quantum computer based on
quantum annealing, which allows them to
scale their qubit counts much faster:
from 128 qubits in 2011, to 512 in 2013,
1,000 in 2015, and 2,048 in 2017, with a
5,000 qubit system expected this year,
2018. Quantum
annealing, however, doesn't operate like a
typical gate-based quantum computer; it
relies on energy minimization, which
narrows the scope of problems it can
solve. These problems are still NP
problems, referred to as QUBO (quadratic
unconstrained binary optimization)
problems, essentially an optimization
technique that also has useful
applications in machine learning. It
is hard to quantify when quantum
annealing will reach a point of quantum
supremacy due to its problem scope,
however, this approach is the fastest to
scale and will bring public quantum
computing sooner than gate-based quantum
computing. The 50 qubit quantum supremacy
milestone is set for gate-based quantum
computing, which is what we discussed
earlier in the video, where all qubits in
the system are entangled and a probability
spread is output. There are
various initiatives to achieve this; to
list a few: Intel, with their 49 qubit
chip unveiled at CES 2018; IBM, with
development of a 50 qubit chip announced
in late 2017; Google, who have built a 50
qubit chip and are now testing it; and
Rigetti, who have plans for a 50 qubit
chip by 2019. For more information on
current initiatives, be sure to check out
other creators on this platform, as the
quantum race is ever-changing and
expanding. Now, to see how complex quantum
computers are, check out this video of
IBM's quantum computer, Q: This is the
first IBM Q computation center, where the
commercial quantum systems used by the
IBM Q network live. The IBM Q network is
a worldwide organization of industrial,
research and academic institutions -
joining IBM to advance quantum computing
and launch the first commercial
applications.
Here we see a 20 qubit system which
will be accessed online by members of
the IBM Q network, in the future they
will have access to 50 qubit systems
which IBM recently prototyped. Listen to
the tinkling whoosh the system makes as
it maintains the ultra-cold, 15
millikelvin temperature, required for
IBM's superconducting qubits to operate.
That's colder than outer space, cold
enough to make atoms almost completely
motionless.
This is an open dilution refrigerator
that contains the qubits of niobium,
silicon and aluminum - it's so dark and
cold inside, that it's almost impossible
to find even one photon of light. The 20
qubit quantum computer you just saw is
available for public use through IBM's
cloud services, and it has a great
community of developers and newcomers
learning quantum algorithms, with many
resources on the types of quantum gates
and their effects on results. This
system is global, with over
60,000 users from more than 1,500
universities, 300 high schools and many
institutions - running over 2 million
experiments with over 35 research papers
and growing. In fact, the first quantum
video game, Quantum Battleships, has been
created by one of these users; you can
hit, miss, or both at the same time ;)!
Microsoft also has a development
environment and extensive documentation
for simulating a quantum computer and
running quantum algorithms on your
computer at home. This is a fairly
computationally intensive process, with
the number of qubits you can simulate
limited by your RAM.
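The RAM scaling can be sanity-checked with a quick calculation, assuming a state-vector simulator stores 2^n complex amplitudes at 16 bytes each (two 64-bit floats):

```python
# A state-vector simulator stores 2^n complex amplitudes; at 16 bytes per
# amplitude, memory doubles with every added qubit.
BYTES_PER_AMPLITUDE = 16

def sim_memory_gib(n_qubits):
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE / 2 ** 30  # in GiB

print(sim_memory_gib(30))  # 16.0 GiB
print(sim_memory_gib(40))  # 16384.0 GiB, i.e. 16 TiB
```

The doubling per qubit falls directly out of the 2^n term, matching the figures quoted next.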
Simulating 30 qubits requires 16 gigabytes
of RAM; adding just one more qubit doubles
the amount of RAM needed, while one less
halves it. Extrapolating forward,
simulating 40 qubits requires 16 terabytes
of memory, which is why there is also the
ability to run simulations on Microsoft's
Azure cloud! Commercial adoption of quantum
computing is still a ways off, but as
stated earlier, 2018 is the 1968 of quantum
computers, and it seems inevitable that
they will be the basis for future computation.
This field of computing is still in its
infancy, but accelerating at an
increasingly exciting and rapid pace!
[Music]
At this point the video has come to a
conclusion, I'd like to thank you for
taking the time to watch it. If you
enjoyed it consider supporting me on
Patreon to keep this channel growing and
if you want me to elaborate on any of the 
topics discussed or have any topic
suggestions please leave them in the
comments below.
Consider subscribing for more content,
follow my Medium publication for
accompanying blogs and like my Facebook
page for more bite-sized chunks of
content. This has been Ankur; you've
been watching Singularity Prosperity,
and I'll see you again soon!
[Music]
