Moore's Law has been showing
signs of exhaustion for some time.
Silicon, the main material used to build
the processors we rely on daily,
is reaching the physical limits
inherent in its structure:
as elements within the chip are miniaturized,
it begins to lose the electrical conductivity
characteristics necessary for our CPUs to function.
The third generation of Intel Core
processors, code-named Ivy Bridge,
already features an impressively
small 22 nm lithography - just to
give you an idea, this means that each
transistor measures about a quarter
the size of the flu virus,
which is roughly 80 nm across.
Because of this extremely reduced
size, the company had to
build its chips with new
3D transistor technology
to standardize the
behavior of electrons and
increase the energy
efficiency of the processors,
since at this scale it would not be
possible to build such small
components with the planar transistor
technology that had been used for years.
Many solutions have been proposed to replace
silicon as the main component, ranging from
graphene to quantum computing, and the latter
has started to show very promising results.
Current processors may
be regarded as collections of tiny
lamps, where "off"
is 0 and "on" is 1.
These little lights are
transistors, and the
speed and number with which
they change state
is what allows our computers to perform
the calculations we are accustomed to.
In quantum computing, "quantum bits",
or q-bits, have not just
two states but a multitude
of them between 0 and 1.
Only a slight variation is needed to achieve a
state change, and many operations can be
performed at once, so computers
equipped with future quantum processors
have the potential to be
millions of times more powerful
than the most modern
supercomputers we see today.
Although the idea of using quantum
physics in the construction
of electronic devices may
sound like science fiction,
this concept represents a shift very similar
to the one from vacuum tubes to transistors,
which at the time seemed strange to most
people but ended up setting the standard.
Computers capable of running
on only 16 qubits have already been
developed - which may seem little,
but for certain tasks these
machines already outperform the
processors currently available.
Encryption is another area that has a
lot to gain from quantum computing.
Virtually unbreakable algorithms
can be developed with
the help of quantum physics
concepts, and in the future
we may have machines with more
processing power than we can handle.
What do you think of quantum computing?
Do you believe it will be
the future of computing?
INTRODUCTION
In the twentieth century, humanity
underwent remarkable technological development,
which is reflected in several areas
of knowledge and sectors of activity.
A fact that many people
are unaware of is that
the great technological
leap made by man in the
last century was based on
two great intellectual
triumphs established
in the same period.
The two scientific gifts that preceded
the technological breakthroughs that
changed humanity's way of life are
Quantum Mechanics and Computer Science.
If today we have ever faster
and more powerful computers,
electronic equipment that allows
accurate medical diagnostics,
and many other electronic devices
that have improved our quality of life,
we owe thanks to all the scientists
who in some way contributed
to the development of these
two areas of knowledge.
Amazingly, the great revolution mentioned
in the previous paragraph started
with a crisis that physics went through
in the late nineteenth century,
because some natural
phenomena could not be explained
by the physical theories
existing at the time.
In this context, there is a
precise date for the birth
of quantum physics:
December 14, 1900,
the day the scientist Max Planck explained the
radiation emission of the black body.
Planck's explanation was
based on the assumption
that the walls of the cavity
emitted radiation in packets,
which should be integer
multiples of a certain minimum
amount; that is, the radiation
should be quantized.
A few years later, in 1905,
the renowned German physicist
Albert Einstein elucidated
the Photoelectric Effect,
another phenomenon not
explained by the wave theory of light.
The photoelectric effect is
the emission of electrons
by a metal surface bombarded
by a beam of light.
Although many of us do not
realize it, this effect is very
familiar, because it is
present in our daily lives:
doors that open and close
automatically, for example,
operate based on
this phenomenon.
Einstein explained the photoelectric
effect by adopting the hypothesis that
light consists of concentrated
packets of energy, which we now call photons;
the phenomenon is then easily
explained when we consider the collision
between the photons of the incident
radiation and the electrons of the metal.
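Einstein's explanation can be summarized in a single formula: the maximum kinetic energy of an ejected electron is the photon energy h·f minus the metal's work function W. The small sketch below is purely illustrative; the work function value (about 2.3 eV, roughly that of sodium) and the light frequency are assumed example numbers.

```python
# Einstein's photoelectric relation: K_max = h*f - W.
# h*f is the energy of one photon; W is the metal's work function.

H_EV = 4.136e-15        # Planck's constant in eV*s

def k_max_ev(frequency_hz, work_function_ev):
    """Max kinetic energy of an ejected electron in eV.
    A negative result means the light cannot eject electrons at all."""
    return H_EV * frequency_hz - work_function_ev

# violet light (~7.5e14 Hz) striking a sodium-like metal (W ~ 2.3 eV):
print(k_max_ev(7.5e14, 2.3))
```

Note that below the threshold frequency f = W/h the result goes negative and no electrons are emitted, no matter how intense the beam - exactly the behavior the wave theory of light could not explain.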
It is important to note
that Einstein's and Planck's
work suggested quantization
but did not explain why it occurs.
A similar problem arose
when the Danish physicist
Niels Bohr introduced
his atomic model in 1913,
which assumed that the electron
could move only in certain
orbits, in which it does not emit
electromagnetic radiation.
Radiation was
emitted only when the
electron "jumped" from
one orbit to another.
With this model, Bohr solved the
problem of atomic stability and explained
the discrete radiation
spectrum of the hydrogen atom,
but it was not clear why the electron cannot
occupy intermediate positions in space.
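Bohr's quantized orbits can be stated numerically: for hydrogen, the allowed energies are E_n = -13.6/n² eV, and a jump between orbits emits a photon carrying exactly the energy difference. A small sketch using these textbook values:

```python
# Bohr model of hydrogen: the electron may occupy only orbits with
# energy E_n = -13.6 / n**2 eV, for integer n = 1, 2, 3, ...
# Radiation is emitted only on a jump, carrying the energy difference.

def bohr_energy_ev(n):
    return -13.6 / n ** 2

# photon emitted in the n=3 -> n=2 jump (the red line of hydrogen):
photon_ev = bohr_energy_ev(3) - bohr_energy_ev(2)
print(round(photon_ev, 3), "eV")
```

Because n must be an integer, only a discrete set of photon energies is possible - which is precisely why the hydrogen spectrum is a set of separate lines rather than a continuum.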
Thus, through these examples, we see that
the quantum theory developed in the first
quarter of the twentieth century had
fragile theoretical and conceptual bases,
because its principles were
sparse and its statements were
created for the specific purpose
of meeting particular needs.
In this context, physicists longed for
authentic postulates and general principles
from which a consistent, efficient and
comprehensive theory could be formulated.
This desire of the
physicists became a reality
with the advent of
quantum mechanics.
Quantum mechanics is
considered the most successful
scientific theory in the
history of science.
This is due to its infallibility so far: its
predictions have been verified by experiments.
Quantum mechanics is the
physical theory used to treat
microscopic particles on the order
of atomic or molecular size,
because if we use Newton's laws to analyze
the behavior of particles in this
size range, we reach results
inconsistent with experiments.
Thus, quantum mechanics is
the basis of Atomic Physics,
Nuclear Physics, Solid State
Physics and Modern Chemistry,
so it is not surprising that
a large number of devices with high
technological value have their operating
principles grounded in quantum mechanics.
To get an idea: since the post-war
period, about a third of the US gross
domestic product has come from
applications of quantum mechanics.
Thus, when we look at, for example,
modern mobile phones,
televisions, and various
other electronic devices,
we must remember that they come from
the vast applicability of quantum theory.
Moreover, some projections indicate that
from the second decade of this century,
most manufacturing jobs in the
world will be connected to nanotechnology,
and working at this scale demands
a sound knowledge of quantum mechanics.
We mentioned the two
great intellectual
triumphs of humanity in
the twentieth century.
However, until now we have
treated only one of them.
So let us set quantum mechanics aside
for a moment to address Computer Science.
In a way, we can say that computer
science was born with the
remarkable 1936 article by the English
mathematician Alan Turing.
In that work, Turing developed the
abstract notion of what we know
as the programmable computer, which
became known as the Turing machine.
In essence, the apparatus Turing proposed
operates with logical sequences of
information units called bits (binary digits),
which can take the values "0" or "1".
Turing's idea was so
important to the development
of humanity that the
computers we use today,
from the simple notebook we use at home to
the most powerful computer in a large
research center, are all physical
realizations of the Turing machine.
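The operation of Turing's abstract machine can be sketched in a few lines of code: a tape of symbols, a read/write head, and a table of transition rules. The minimal simulator below is purely illustrative - the state names and the tiny bit-inverting program are invented for the example.

```python
# A toy Turing machine: the head reads a symbol, looks up a rule,
# writes a symbol, moves left or right, and changes state.
# This example program inverts a tape of bits (0 -> 1, 1 -> 0).

def run_turing_machine(tape, rules, state="start"):
    tape = list(tape)
    head = 0
    while state != "halt" and 0 <= head < len(tape):
        state, write, move = rules[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape)

# rules: (state, symbol read) -> (next state, symbol to write, move)
invert_rules = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
}

print(run_turing_machine("0110", invert_rules))  # -> 1001
```

Despite its simplicity, adding more states and rules to a table like this is enough, in principle, to express any computation a modern computer can perform.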
Thus, all information provided
to a computer is read,
processed and returned in
the form of bit sequences.
A fact that reveals the
power of the Turing machine
is a result known as the
Church-Turing thesis,
which states that the Turing machine,
although simple, is capable of
solving any computational problem
solvable by any other type of computer.
This implies that if a
problem is insoluble for the Turing
machine, it cannot be
solved by any other type of computer.
The Church-Turing thesis means that
we need not consider devices other
than the Turing machine to find out whether
a problem is computable or not;
however, the theory says
nothing about the time
needed to solve a problem
that is computable.
This allowed computers to evolve
in speed without losing
the fundamental principle
of operation based on bits.
In this sense, as we shall
see in this work, there
will come a moment in the
evolution of computers
when the merger of the
two intellectual achievements
mentioned in this
introduction becomes inevitable.
This great event will open the door
to what we call quantum computing,
which could revolutionize the way
humanity currently treats information.
Thus, the aim of this work is to present
Quantum Computing, with a view to its use
in primary schools as well as to
informing the general public.
MOORE'S LAW
There is evidence that the first programmable
computer emerged in 1941, during
World War II, and its invention is attributed
to the German engineer Konrad Zuse.
The computer developed by Zuse was
named Z3 and was electromechanical,
built from relays, so that information
was processed by their mechanical switching.
In the same period, scientists
from Britain and the
United States also worked on
similar technology projects,
and thus developed two computers
critical to the evolution
of computing, called,
respectively, Colossus and ENIAC.
The significant advance of these computers
over the Z3 was the use of electronics,
which enabled a real increase in
processing speed, because electrical
currents and charges can be handled
much faster than the moving parts of the Z3.
However, it was the discovery of
transistors that enabled the development
from the rustic computers described
above into what we have today.
The fact is that the notebook I used to
write this text is about ten million
times faster than ENIAC, and much lighter,
given that ENIAC weighed 27 tons.
Yet despite these striking differences in
processing speed and weight, my notebook and
ENIAC both use electric charges and currents
to store and process information.
If we analyze this evolution
of computers using
a parameter better suited
to our purpose,
in 1950 about 10^19 atoms
- that's right,
ten billion billion - were needed to represent
a single bit of information.
Some projections indicate that only one
atom will constitute a bit in a few years.
These forecasts are based on
what we call Moore's Law.
Gordon Moore, co-founder of the US
microprocessor company Intel, noted
in the 1960s that the number of atoms
needed to represent a bit is
reduced by half roughly every
year and a half.
This is equivalent to saying
that the number of transistors
on an integrated circuit
doubles every 18 months.
As an example, the Intel 8086
processor, launched in 1978, had
29,000 transistors, while the Pentium IV,
launched in 2000, had 42 million.
The laptop processor I use from day
to day has nearly 1 billion transistors!
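As a rough sanity check of these figures, we can project the 8086's transistor count forward under exponential doubling. The calculation below is only an order-of-magnitude exercise - real product lines track the trend loosely - and compares the 18-month rule with the 24-month variant that is also often quoted.

```python
# Projecting transistor counts under Moore's Law, assuming a fixed
# doubling period (an idealization; real chips only follow the trend
# approximately).

def projected_transistors(initial, years, doubling_period_years):
    """Count projected forward after `years` under exponential doubling."""
    doublings = years / doubling_period_years
    return initial * 2 ** doublings

start = 29_000           # Intel 8086, 1978
years = 2000 - 1978      # until the Pentium IV (42 million transistors)

for period in (1.5, 2.0):  # 18-month and 24-month doubling rules
    print(f"{period} yr doubling: {projected_transistors(start, years, period):,.0f}")
```

Running this shows the 24-month rule predicting roughly 59 million transistors for 2000, quite close to the Pentium IV's actual 42 million, while the 18-month rule overshoots considerably.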
From the physical point of view, Moore's
Law imposes a natural limit on computers:
once the limit of one bit
per atom is reached, there will
be no way to increase the bit
density per chip any further.
However, when the atomic scale
is reached, the classical
paradigm of the Turing machine
is no longer valid; that is,
we must think of a computing model based
on the laws of quantum mechanics.
It is precisely at this point that
what we call Quantum Computation arises.
THE Q-BIT AND ITS PROPERTIES
Unfortunately, the computer I am using
to write this text is not quantum.
In it, the processing of information still
follows the classical physics paradigm.
A bit in it can be realized
by a capacitor that is either
charged or not, by the
magnetization of a hard drive,
or by any other mechanism
that provides only one
result at a time - on
or off ("0" or "1").
We can make a very simplistic
representation of a bit using a coin.
To this end, we represent the result 'heads'
with '0' and the result 'tails' with '1'.
As we intuit, coins are macroscopic
objects governed by classical
physics, so we get just one of the
results ("0" or "1") on each toss.
However, if coins
behaved like microscopic
objects obeying the principles
of quantum mechanics,
heads and tails could
be seen at the same
time; that is, in a toss of the coin, the
heads and tails results would coexist.
This is possible
thanks to a property
of quantum mechanics
called superposition.
The superposition of states,
although it seems a bit strange, can
be understood through an analogy
known as Schrödinger's cat.
To illustrate the implications of the
solutions of the fundamental
equation of quantum mechanics and
the principle of superposition,
Schrödinger proposed a thought
experiment in which a cat
can supposedly be
dead and alive at the same time.
In this sense, Schrödinger's
cat alludes to this
curious quantum property,
as made explicit below.
Consider a cat locked in a box containing
a flask of radioactive material,
which has a 50% chance of emitting a radioactive
particle each hour, and a Geiger counter.
The Geiger counter is a device
used to detect radiation.
If the material releases a radioactive
particle, the counter senses it
and triggers a hammer, which
in turn breaks a bottle of poison.
Of course, once an
hour has passed, only one
of two possible events
will have occurred:
the atom emitted a radioactive
particle, or it did not (the
probability of each
event is the same).
As a result of the interactions inside
the box, the cat is either alive or dead.
But we cannot know which without
opening the box to check.
If we try to describe what happened
inside the box, using the laws
of quantum mechanics, we will come
to a very strange conclusion.
The cat would be described by an
incredibly complex wave function resulting from
the superposition of two states, combining
50% "live cat" and 50% "dead cat".
That is, by applying the quantum
formalism, the cat would
be "alive" and "dead" at the same time,
occupying the two states simultaneously!
The only way to find out what
"really" happened to the cat
is to perform a measurement:
open the box and look inside.
In some cases we will find the cat
alive, and in others, a dead cat.
This is why, when performing the measurement,
the observer interacts with the
system and changes it, breaking the
superposition of the two states
and causing the system to be observed
in one of the two possible states.
That is a simplistic way of explaining
what we call wave function collapse,
which is characteristic of the
measurement process in quantum mechanics.
Likewise, when a q-bit
is measured, the result
will always be '0' or
'1', obtained probabilistically.
Note that q-bits are mathematical
objects with certain specific
properties that can be implemented
as real physical objects.
Examples of physical systems that
can be used in quantum computers
to represent q-bits are the different
polarizations of a photon;
the alignment of a nuclear spin in
a uniform magnetic field;
and the two states of an
electron orbiting an atom.
QUANTUM ALGORITHMS
Anyone with some knowledge
of computers knows that to
make a computer perform a given task
it is necessary to program it.
And before that, it is essential
to devise a good algorithm.
Put simply, we can say
that an algorithm is a
set of procedures for
accomplishing a certain task.
For example, if we develop an
algorithm for changing the flat tire
of a car and two steps
end up swapped,
the success of the task will be compromised
(say, if the algorithm does not make clear
that the wheel nuts should be loosened
before the car is lifted with the jack).
Accordingly, a program
will be better the more
efficient the algorithm
on which it is based.
Fortunately, our computer
scientists, systems
analysts and other
related professionals
are very competent at
developing algorithms for
the computers we have
today, so that numerous
programs facilitate our
lives and allow computing
laypeople, like me, to use
this awesome machine.
However, with the advent of quantum
computing, these programs will become obsolete.
Or rather, not only these
programs but the very theory
used to elaborate them
will become obsolete.
Thus, when this new revolution
occurs, programs will have
to be constructed
from quantum algorithms.
It is precisely at this
point that a new
challenge appears:
with this new paradigm,
future developers will
need to understand well
how information is treated
from the quantum perspective,
so that knowledge
of quantum mechanics will no
longer be a privilege
restricted to physicists.
Currently, some quantum
algorithms have already been
proposed that hold a considerable
advantage over classical algorithms.
One of these quantum algorithms was
developed by Peter Shor in 1994.
When he proposed the algorithm, Shor
worked at AT&T, developing
research that pointed out advantages of quantum
computers over the Turing machine.
In this scenario, Shor formulated
a quantum algorithm for
decomposing a number with many
digits into its prime factors.
The key detail is that Shor's
algorithm performs this task in
a time many times smaller than that
spent by classical algorithms.
The problem of factorization is essential to
current cryptographic systems, so such
algorithms will jeopardize the security of these
systems (banks and governments)
from the moment the first large
quantum computer starts operating.
Another quantum algorithm worth mentioning
was proposed by the Indian-born scientist Lov
Grover in 1996, while he was working at the Bell
research laboratories in the United States.
Grover proposed a search algorithm
which, as its name suggests, performs the
task of searching a database, finding items
that have certain desired properties.
We are accustomed to
using systems like these
when we browse
the internet.
By way of illustration,
consider a task that
would classically require
10,000 searches;
Grover's quantum algorithm
would require only about 100.
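The quoted numbers follow from the query counts involved: a classical scan needs up to N checks, while Grover's algorithm needs on the order of √N quantum queries. A quick comparison:

```python
import math

# Unstructured search over N items: a classical scan may have to examine
# every item, while Grover's algorithm needs on the order of sqrt(N)
# quantum queries - so 10,000 searches shrink to about 100.

def classical_queries(n):
    return n                 # worst case: check each item in turn

def grover_queries(n):
    return math.isqrt(n)     # ~ sqrt(N) quantum queries

for n in (10_000, 1_000_000):
    print(n, classical_queries(n), grover_queries(n))
```

The advantage compounds with scale: at a million items the classical scan needs up to a million checks, while Grover's approach needs on the order of a thousand.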
The algorithm may be successfully
applied to practical
problems in molecular biology
and genetic engineering.
Thus, through these two examples
of quantum algorithms,
we realize that quantum
computers may, in fact,
revolutionize the way we
treat information, and for
that to happen, new quantum
algorithms must be developed.
This is a great challenge for
the future of computing.
NEURAL NETWORKS AND QUANTUM COMPUTING
An interesting application of quantum
computing concepts is to neural networks.
The neural network model was constructed based
on the human brain, because the brain processes
information in an entirely different way
from a conventional digital computer.
The brain can be
considered a highly
sophisticated, nonlinear
and parallel computer,
which uses its constituent cells,
called neurons, to perform processing
tasks such as perception and motor control
much faster than any modern digital computer.
In his book, Haykin says that at birth
the human brain already has a great structure and
the ability to develop its own rules through
what we usually call "experience".
Experience is accumulated over time, with
the most dramatic development occurring
during the first two years of life,
though it continues well beyond that stage.
Thus, neural "development" is synonymous
with a plastic brain: plasticity
allows the developing nervous
system to adapt to its environment.
Plasticity is essential
to the functioning of
neurons as the information-processing
units of the
human brain, and it is
equally essential to the formation of
neural networks built
from artificial neurons.
According to Haykin, a neural network, in its
most general form, is a machine
designed to model the way the brain performs a
particular task or function of interest;
the network is typically
built using electronic components or
simulated in software
on a digital computer.
This author defines a neural network
as an adaptive machine as follows:
a neural network is a
massively parallel distributed
processor, made up of
simple processing units,
that has a natural
propensity for storing
experiential knowledge and
making it available for use.
It resembles the brain in two respects:
knowledge is acquired by the network from
its environment through a learning process;
and connection strengths between
neurons, known as synaptic
weights, are used to store
the acquired knowledge.
The procedure used to
carry out the network's
training process is called a
learning algorithm,
whose function is to modify the
synaptic weights of the network
in an orderly way to achieve
a desired design goal.
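A minimal sketch of such a learning algorithm is a single perceptron whose "synaptic weights" are nudged toward a target output; here it learns the logical AND function. The learning rate and epoch count are arbitrary illustrative choices, and real neural networks use many units and richer training rules.

```python
# A single perceptron trained on the logical AND function.
# The learning rule adjusts the "synaptic weights" in proportion
# to the error between the target and the current output.

def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]   # synaptic weights
    b = 0.0          # bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # the learning step: nudge each weight toward the target
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
       for (x1, x2), _ in data])   # -> [0, 0, 0, 1]
```

Nothing in the loop stores the AND table explicitly: the knowledge ends up encoded entirely in the adjusted weights, exactly as the text describes.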
It is also possible for a neural network to
modify its own topology, which is motivated by
the fact that neurons in the human brain can
die and new synaptic connections can grow.
Neural networks are also found in the
literature under the names neurocomputers,
connectionist networks and parallel
distributed processors, among others.
PHYSICAL REALIZATION OF QUANTUM COMPUTER
Experimental quantum
computing currently faces
a scenario very
similar to the one traditional
computing faced in the 1930s,
when it was not yet known
which technology would
be best for computers.
Similarly, several practical
alternatives are being tested to
implement the qubits of quantum
computing, among which are:
quantum dots, nuclear magnetic
resonance in liquids,
ion traps, and superconductors,
among other systems.
Thus, over the last decade, several
prototypes of quantum computers using
little more than a few q-bits have been
successfully tested in laboratories around the world.
These tests have
demonstrated the functioning
of the quantum algorithms
discovered so far.
The great challenge is to increase the
number of qubits in a controlled manner, and
the relevant research on this topic will
certainly be supported by nanotechnology.
In this way, even without knowing
which technology will prove
best for the development
of quantum computers,
we already know the four
basic requirements for the
experimental implementation
of quantum computing.
These requirements are as follows:
(1) the representation of qubits;
(2) controllable unitary evolutions;
(3) preparation of the initial states of the qubits;
(4) measurement of the final state of the qubits.
Computing has gone through intense
development in recent years.
According to the scientist Gordon
Earl Moore, the number of
transistors on a processor would
double roughly every 24 months.
These transistors are
the fundamental devices
that allow the processor
to perform calculations.
Moore's forecast became
known as Moore's Law, referring to
the capacity of computers, which would
theoretically double every two years.
Companies like Intel and AMD
continually release ever faster
processors, and to do so, ever smaller
portions of material are handled.
Some transistors are now manufactured
from only a few molecules, so it is
hard to reduce their size any further
to expand the capacity of electronic devices.
The laws of physics for
objects smaller than atoms are
different from those we have
learned and are accustomed to.
In this subatomic world,
energy is lost or gained in
a quantized way, in energy
packets; a single packet is called
a quantum, and several packets are called quanta.
For example, a light quantum
is known as a photon, which
is also the smallest possible
portion of light energy.
Subatomic particles can also
take on several different states
simultaneously, until an observer
determines their current state.
Computers built on classical physics
work with two states, which
are represented by 0 and 1, and each
of these values is called a bit.
In a quantum computer,
the particles can take
the value 0, 1, or even
both simultaneously.
These values are called qubits.
Used even in a simple form,
qubits can exponentially
increase the processing
capability of a device.
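The exponential growth mentioned here comes from the size of the quantum state itself: describing n qubits requires 2^n amplitudes, so each additional qubit doubles the state a quantum processor manipulates at once. A tiny illustration:

```python
# Describing the joint state of n qubits takes 2**n complex amplitudes,
# so the state space doubles with every qubit added. Classically,
# n bits hold just one of those 2**n configurations at a time.

for n in (1, 2, 10, 50):
    print(f"{n:>2} qubits -> {2 ** n:,} basis states")
```

Already at 50 qubits the count exceeds 10^15 basis states, which hints at why even small quantum machines are hard to simulate on classical hardware.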
This work has offered a basic literature
review on quantum computing,
in which we saw that this promising
area of science proposes to
merge the ideas of quantum
mechanics and computer science.
We note that, despite the paucity of
projects to construct quantum computers,
many developments have been shown to be possible
and applicable even on classical computers.
We also realize that the
adoption of the quantum
paradigm in computing
is a natural path,
because it walks in step
with the shrinking of electronic
devices in computers, as
predicted by Moore's Law.
Note that quantum computing is not,
as some may mistakenly think, just one
more among many attempts to replace
a depletable technology.
It is a new paradigm of computing that may
have profound consequences, not only for
technology but also for information theory,
computer science, and science in general.
We imagine that, just as the computing born
in the last century brought numerous
applications that contributed to the
development of humanity in various areas,
quantum computing will also
provide applications reaching from
space travel to medicine, thus
increasing people's quality of life.
We hope, therefore, that the pedagogical way
in which this text was drawn up, and its
accessible language, will allow more and more
people to learn about this new field of science
and, who knows, to
develop an aptitude for
research on this vast
and fruitful subject.
