The history of computer science began long before the emergence of the modern academic discipline. Developments in previous centuries alluded to the field that we now know as computer science.
This progression, from mechanical inventions
and mathematical theories towards modern computer
concepts and machines, led to the development
of a major academic field and the basis of
a massive worldwide industry.
== Prehistory ==
The earliest known tool for use in computation
was the abacus, developed in the period 2700–2300 BCE in Sumer.
The Sumerians' abacus consisted of a table
of successive columns which delimited the
successive orders of magnitude of their sexagesimal
number system.
It was originally used by drawing lines in sand with pebbles. Abaci of a more modern design are still used as calculation tools today, such as the Chinese abacus.

In the 5th century BC in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3959 rules known as the Ashtadhyayi, which was highly systematized and technical. Pāṇini used metarules, transformations and recursions.

The Antikythera mechanism is believed
to be an early mechanical analog computer.
It was designed to calculate astronomical
positions.
It was discovered in 1901 in the Antikythera
wreck off the Greek island of Antikythera,
between Kythera and Crete, and has been dated
to circa 100 BC.

Mechanical analog computer
devices appeared again a thousand years later
in the medieval Islamic world and were developed
by Muslim astronomers, such as the mechanical
geared astrolabe by Abū Rayhān al-Bīrūnī,
and the torquetum by Jabir ibn Aflah.
According to Simon Singh, Muslim mathematicians
also made important advances in cryptography,
such as the development of cryptanalysis and
frequency analysis by Alkindus.
Programmable machines were also invented by
Muslim engineers, such as the automatic flute
player by the Banū Mūsā brothers, and Al-Jazari's
programmable humanoid automata and castle
clock, which is considered to be the first
programmable analog computer.
Technological artifacts of similar complexity
appeared in 14th century Europe, with mechanical
astronomical clocks.

When John Napier discovered
logarithms for computational purposes in the
early 17th century, there followed a period
of considerable progress by inventors and
scientists in making calculating tools.
In 1623 Wilhelm Schickard designed a calculating machine but abandoned the project when the prototype he had started building was destroyed by a fire in 1624. Around 1640, Blaise Pascal,
a leading French mathematician, constructed
a mechanical adding device based on a design
described by Greek mathematician Hero of Alexandria.
Then in 1672 Gottfried Wilhelm Leibniz invented the Stepped Reckoner, which he completed in 1694.

In 1837 Charles Babbage first described
his Analytical Engine which is accepted as
the first design for a modern computer.
The analytical engine had expandable memory,
an arithmetic unit, and logic processing capabilities
able to interpret a programming language with
loops and conditional branching.
Although never built, the design has been
studied extensively and is understood to be
Turing equivalent.
The analytical engine would have had a memory capacity of less than 1 kilobyte and a clock speed of less than 10 hertz.

Considerable
advancement in mathematics and electronics
theory was required before the first modern
computers could be designed.
== Binary logic ==
In 1702, Gottfried Wilhelm Leibniz developed
logic in a formal, mathematical sense with
his writings on the binary numeral system.
In his system, the ones and zeros also represent
true and false values or on and off states.
But it took more than a century before George Boole published his Boolean algebra in 1854, a complete system that allowed computational processes to be mathematically modeled.

By this time, the first mechanical devices driven by a binary pattern had been invented.
The industrial revolution had driven forward
the mechanization of many tasks, and this
included weaving.
Punched cards controlled Joseph Marie Jacquard's
loom in 1801, where a hole punched in the
card indicated a binary one and an unpunched
spot indicated a binary zero.
Jacquard's loom was far from being a computer,
but it did illustrate that machines could
be driven by binary systems.
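Boole's two-valued algebra is enough, on its own, to perform arithmetic. As a minimal sketch (illustrative Python, with names of our choosing rather than Boole's notation), a half adder built from nothing but Boolean operations adds two binary digits:

<syntaxhighlight lang="python">
# A half adder built purely from Boolean operations: two-valued
# logic performing arithmetic. Illustrative sketch, not period code.
def half_adder(a, b):
    total = a ^ b   # XOR gives the sum bit
    carry = a & b   # AND gives the carry bit
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
</syntaxhighlight>

Chaining such adders yields binary addition of arbitrary width, the same principle later realized in relay and electronic hardware.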
== Creation of the computer ==
Before the 1920s, computers (sometimes computors)
were human clerks that performed computations.
They were usually under the direction of a physicist.
Many thousands of computers were employed
in commerce, government, and research establishments.
Most of these computers were women.
Some performed astronomical calculations for
calendars, others ballistic tables for the
military.

After the 1920s, the expression computing
machine referred to any machine that performed
the work of a human computer, especially those
in accordance with effective methods of the
Church–Turing thesis.
The thesis states that a mathematical method
is effective if it could be set out as a list
of instructions able to be followed by a human
clerk with paper and pencil, for as long as
necessary, and without ingenuity or insight.

Machines
that computed with continuous values became
known as the analog kind.
They used machinery that represented continuous
numeric quantities, like the angle of a shaft
rotation or a difference in electrical potential.

Digital machinery, in contrast to analog, was able to render a state of a numeric value and store each individual digit.
Digital machinery used difference engines
or relays before the invention of faster memory
devices.

The phrase computing machine gradually
gave way, after the late 1940s, to just computer
as the onset of electronic digital machinery
became common.
These computers were able to perform the calculations that had previously been carried out by human clerks.

Since the values stored by digital
machines were not bound to physical properties
like analog devices, a logical computer, based
on digital equipment, was able to do anything
that could be described as "purely mechanical."
The Turing machine, created by Alan Turing, is a hypothetical device conceived in order to study the properties of such hardware.
== Emergence of a discipline ==
=== Charles Babbage and Ada Lovelace ===
Charles Babbage is often regarded as one of
the first pioneers of computing.
Beginning in the 1810s, Babbage had a vision
of mechanically computing numbers and tables.
Putting this vision into practice, Babbage designed a calculator to compute numbers up to 8 decimal places.
Continuing with the success of this idea,
Babbage worked to develop a machine that could
compute numbers with up to 20 decimal places.
By the 1830s, Babbage had devised a plan to
develop a machine that could use punched cards
to perform arithmetical operations.
The machine would store numbers in memory
units, and there would be a form of sequential
control.
This means that one operation would be carried out in sequence before another, so that the machine would produce an answer and not fail. This machine was to be known as the “Analytical Engine”, the first true forerunner of the modern computer.

Ada Lovelace
(Augusta Ada Byron) is credited as the pioneer
of computer programming and is regarded as
a mathematical genius, a result of the mathematically
heavy tutoring regimen her mother assigned
to her as a young girl.
Lovelace began working with Charles Babbage
as an assistant while Babbage was working
on his “Analytical Engine”, the first
mechanical computer.
During her work with Babbage, Ada Lovelace
became the designer of the first computer
algorithm, which had the ability to compute
Bernoulli numbers.
Moreover, Lovelace’s work with Babbage led her to predict that future computers would not only perform mathematical calculations, but also manipulate symbols, mathematical or not.
While she was never able to see the results
of her work, as the “Analytical Engine”
was not created in her lifetime, her efforts
in later years, beginning in the 1840s, did
not go unnoticed.
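Lovelace's program was written for the Analytical Engine's punched-card operations, but the computation it performed can be sketched with the standard recurrence for the Bernoulli numbers. The Python below is a modern illustration under that assumption, not a transcription of her program:

<syntaxhighlight lang="python">
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0..B_n as exact fractions, using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1 (convention B_1 = -1/2)."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        B.append(-sum(comb(m + 1, k) * B[k] for k in range(m)) / (m + 1))
    return B

print(bernoulli(8))  # B_0..B_8: 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30
</syntaxhighlight>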
=== Alan Turing and the Turing machine ===
The mathematical foundations of modern computer
science began to be laid by Kurt Gödel with
his incompleteness theorem (1931).
In this theorem, he showed that there were
limits to what could be proved and disproved
within a formal system.
This led to work by Gödel and others to define
and describe these formal systems, including
concepts such as mu-recursive functions and
lambda-definable functions.

In 1936 Alan Turing
and Alonzo Church independently, and also
together, introduced the formalization of
an algorithm, with limits on what can be computed,
and a "purely mechanical" model for computing.
This became the Church–Turing thesis, a
hypothesis about the nature of mechanical
calculation devices, such as electronic computers.
The thesis claims that any calculation that
is possible can be performed by an algorithm
running on a computer, provided that sufficient
time and storage space are available.

In 1936, Alan Turing also published his seminal work on Turing machines, an abstract digital computing machine that is now simply referred to as the universal Turing machine.
This model introduced the principle of the modern computer and was the birthplace of the stored-program concept that almost all modern-day computers use.
These hypothetical machines were designed to determine formally what can be computed, taking into account limitations on computing ability.
If a Turing machine can complete the task, the task is considered Turing computable; a system that can simulate any Turing machine is said to be Turing complete.

The Los Alamos physicist Stanley Frankel has described John von Neumann's
view of the fundamental importance of Turing's
1936 paper, in a letter:
I know that in or about 1943 or ‘44 von
Neumann was well aware of the fundamental
importance of Turing's paper of 1936…
Von Neumann introduced me to that paper and
at his urging I studied it with care.
Many people have acclaimed von Neumann as
the "father of the computer" (in a modern
sense of the term) but I am sure that he would
never have made that mistake himself.
He might well be called the midwife, perhaps,
but he firmly emphasized to me, and to others
I am sure, that the fundamental conception
is owing to Turing...
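Turing's abstract machine is simple enough to simulate directly. The following Python sketch is illustrative (the rule format and function name are ours, not Turing's 1936 formalism): a finite table of (state, symbol) rules reads and writes symbols on an unbounded tape.

<syntaxhighlight lang="python">
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """Minimal Turing machine simulator. rules maps (state, symbol)
    to (next_state, symbol_to_write, head_move), head_move in {"L", "R"}."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        state, write, move = rules[(state, cells.get(head, blank))]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example: a machine that flips every bit, then halts at the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine(flip, "10110"))  # -> 01001_
</syntaxhighlight>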
=== Akira Nakashima and switching circuit theory ===
Up to and during the 1930s, electrical engineers
were able to build electronic circuits to
solve mathematical and logic problems, but
most did so in an ad hoc manner, lacking any
theoretical rigor.
This changed with NEC engineer Akira Nakashima's
switching circuit theory in the 1930s.
From 1934 to 1936, Nakashima published a series
of papers showing that the two-valued Boolean
algebra, which he discovered independently
(he was unaware of George Boole's work until
1938), can describe the operation of switching
circuits.
This concept, of utilizing the properties
of electrical switches to do logic, is the
basic concept that underlies all electronic
digital computers.
Switching circuit theory provided the mathematical
foundations and tools for digital system design
in almost all areas of modern technology.

Nakashima's
work was later cited and elaborated on in
Claude Elwood Shannon's seminal 1937 master's
thesis "A Symbolic Analysis of Relay and Switching
Circuits".
While taking an undergraduate philosophy class,
Shannon had been exposed to Boole's work,
and recognized that it could be used to arrange
electromechanical relays (then used in telephone
routing switches) to solve logic problems.
His thesis became the foundation of practical
digital circuit design when it became widely
known among the electrical engineering community
during and after World War II.
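The correspondence Nakashima and Shannon identified can be sketched in a few lines: relay contacts wired in series behave as Boolean AND, and contacts wired in parallel as OR. The Python below is an illustrative model of that idea, not either author's notation:

<syntaxhighlight lang="python">
def series(*contacts):
    """Current flows through contacts in series only if all are closed (AND)."""
    return all(contacts)

def parallel(*contacts):
    """Current flows through contacts in parallel if any is closed (OR)."""
    return any(contacts)

# A circuit passing current when (a AND b) OR ((NOT a) AND c):
def circuit(a, b, c):
    return parallel(series(a, b), series(not a, c))

# The circuit's behavior matches the Boolean expression exactly.
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            assert circuit(a, b, c) == ((a and b) or ((not a) and c))
</syntaxhighlight>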
=== Early computer hardware ===
The world's first electronic digital computer,
the Atanasoff–Berry computer, was built
on the Iowa State campus from 1939 through
1942 by John V. Atanasoff, a professor of
physics and mathematics, and Clifford Berry,
an engineering graduate student.
In 1941, Konrad Zuse developed the world's
first functional program-controlled computer,
the Z3.
In 1998, it was shown to be Turing-complete
in principle.
Zuse also developed the S2 computing machine,
considered the first process control computer.
He founded one of the earliest computer businesses
in 1941, producing the Z4, which became the
world's first commercial computer.
In 1946, he designed the first high-level
programming language, Plankalkül.

In 1948,
the Manchester Baby was completed; it was
the world's first electronic digital computer
that ran programs stored in its memory, like
almost all modern computers.
Both the influence on Max Newman of Turing's seminal 1936 paper on Turing machines and Newman's logico-mathematical contributions to the project were crucial to the successful development of the Baby.

In 1950, Britain's National Physical
Laboratory completed Pilot ACE, a small scale
programmable computer, based on Turing's philosophy.
With an operating speed of 1 MHz, the Pilot
Model ACE was for some time the fastest computer
in the world.
Turing's design for ACE had much in common
with today's RISC architectures and it called
for a high-speed memory of roughly the same
capacity as an early Macintosh computer, which
was enormous by the standards of his day.
Had Turing's ACE been built as planned and
in full, it would have been in a different
league from the other early computers.
=== Shannon and information theory ===
Claude Shannon went on to found the field
of information theory with his 1948 paper
titled A Mathematical Theory of Communication,
which applied probability theory to the problem
of how to best encode the information a sender
wants to transmit.
This work is one of the theoretical foundations
for many areas of study, including data compression
and cryptography.
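The central quantity of that paper, the entropy of a source, gives the minimum average number of bits per symbol needed to encode it. A minimal sketch (illustrative Python, function name ours):

<syntaxhighlight lang="python">
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # a fair coin: 1.0 bit per toss
print(entropy([0.9, 0.1]))  # a biased coin: ~0.47 bits per toss
</syntaxhighlight>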
=== Wiener and cybernetics ===
From experiments with anti-aircraft systems
that interpreted radar images to detect enemy
planes, Norbert Wiener coined the term cybernetics
from the Greek word for "steersman."
He published "Cybernetics" in 1948, which
influenced artificial intelligence.
Wiener also compared computation, computing
machinery, memory devices, and other cognitive
similarities with his analysis of brain waves.

The
first actual computer bug was a moth.
It was found stuck between the relays of the Harvard Mark II. The invention of the term 'bug' is often, but erroneously, attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945; most other accounts conflict at least with these details.
According to these accounts, the actual date was September 9, 1947, when operators filed
this 'incident' — along with the insect
and the notation "First actual case of bug
being found" (see software bug for details).
=== John von Neumann and the von Neumann architecture ===
In 1946, a model for computer architecture
was introduced and became known as Von Neumann
architecture.
Since 1950, the von Neumann model has provided uniformity in subsequent computer designs.
The von Neumann architecture was considered
innovative as it introduced an idea of allowing
machine instructions and data to share memory
space.
The von Neumann model is composed of three
major parts, the arithmetic logic unit (ALU),
the memory, and the instruction processing
unit (IPU).
In von Neumann machine design, the IPU passes addresses to memory, and the contents of memory are routed either back to the IPU, if an instruction is being fetched, or to the ALU, if data is being fetched.

Von Neumann’s machine design
uses a RISC (Reduced instruction set computing)
architecture, which means the instruction
set uses a total of 21 instructions to perform
all tasks.
(This is in contrast to CISC, complex instruction
set computing, instruction sets which have
more instructions from which to choose.)
With von Neumann architecture, main memory
along with the accumulator (the register that
holds the result of logical operations) are
the two memories that are addressed.
Operations can be carried out as simple arithmetic (performed by the ALU: addition, subtraction, multiplication and division), conditional branches (which would now be recognized as if statements or while loops, and which serve as go to statements), and logical moves between the different components of the machine, i.e., a move from the accumulator to memory or vice versa.
Von Neumann architecture accepts fractions
and instructions as data types.
Finally, as the von Neumann architecture is
a simple one, its register management is also
simple.
The architecture uses a set of seven registers
to manipulate and interpret fetched data and
instructions.
These registers include the "IR" (instruction
register), "IBR" (instruction buffer register),
"MQ" (multiplier quotient register), "MAR"
(memory address register), and "MDR" (memory
data register).
The architecture also uses a program counter
("PC") to keep track of where in the program
the machine is.
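The fetch-decode-execute cycle at the heart of this design is easy to make concrete. The Python below is an illustrative toy machine, with opcodes and a memory layout invented for this example rather than von Neumann's actual instruction set; its single memory holds program and data side by side, and the program counter drives the loop:

<syntaxhighlight lang="python">
def run(memory):
    """Toy von Neumann-style machine: instructions and data share memory."""
    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        op, addr = memory[pc]      # fetch the instruction the PC points at
        pc += 1
        if op == "LOAD":           # acc <- memory[addr]
            acc = memory[addr]
        elif op == "ADD":          # acc <- acc + memory[addr]
            acc += memory[addr]
        elif op == "STORE":        # memory[addr] <- acc
            memory[addr] = acc
        elif op == "JUMP":         # pc <- addr
            pc = addr
        elif op == "HALT":
            return memory

memory = [
    ("LOAD", 4),   # 0: acc = memory[4]
    ("ADD", 5),    # 1: acc += memory[5]
    ("STORE", 6),  # 2: memory[6] = acc
    ("HALT", 0),   # 3: stop
    2, 3, 0,       # 4-6: data stored alongside the program
]
print(run(memory)[6])  # -> 5
</syntaxhighlight>

Because instructions live in the same memory as data, a program could in principle modify its own instructions, which is the defining property of the stored-program design.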
== See also ==
Computer Museum
History of computing
History of computing hardware
History of software
List of computer term etymologies, the origins
of computer science words
List of prominent pioneers in computer science
Timeline of algorithms
History of personal computers
Women in computing
Timeline of women in computing
