Hi, this is Zaheer, and I will be your instructor in this course. We are going to learn the fundamental concepts of computer science as they are taught to computer science majors in universities. In this lecture we will begin by introducing what a computer and computer science are, and then we will look at the development and history of computer science, keeping in view the history of the tools that led to the modern computer.
The first thing I am going to emphasize here is
that learning to use a computer is not the same as learning computer science. A computer is just a device or tool to solve problems, or, as you might have heard in the conventional definition, a computer is an electronic device that stores, retrieves, and processes data. Computer science, on the other hand, is the study of the principles a computer uses to solve our problems, or we can say it is the study of the computational methods it uses to solve our problems. Computer science, computer science and engineering, computing, and informatics all refer to the same field or discipline of study, so do not be confused by the different names of the subject. Essentially, computer science is the science that deals with the theory and methods of processing information in digital computers, the design of computer hardware and software, and the applications of computers.
So what should a student of computer science learn in this course, or in any other course? Peter Denning suggests that computer science practitioners must be skilled in the following areas. The first is algorithmic thinking: you must learn how to use algorithms to solve problems with a computer. The second is representation, which refers to techniques for storing data so that it can be processed efficiently. Then we have programming: programming refers to writing code, or creating software. While writing code you will need to make use of algorithms and understand how the computer is going to process your instructions; that means you will have to combine algorithmic thinking with your knowledge of representations to create software, or programs, on a computer. Design embodies the three areas above: creating hardware and software that are easy to use and learn.
When you say the word computer,
word computer
you mean the device that is used for
communicating maybe for
entertainment or maybe at work. we will
use the word computing system to refer
to a system that is used to solve
problems. A computing system is composed
of hardware software and data. Hardware
is any tangible or physical component
of the system while software is a set of
instructions given to computer to
perform a specific task and data is the
facts our figures that are provided by
the user that computer processes.
A computing system consists of the following layers, which we will study one by one, from bottom to top, in this course. Starting from the bottom, the first layer is information; then come hardware, programming, operating system, applications, and communications.
Information inside a computer is represented in the form of binary numbers, that is, the digits 1 and 0. To understand how a computer processes information, we must understand the binary number system and its relationship to other number systems. All types of information, whether text, numbers, images, audio, or video, have to be represented in a computer using specific methods; all of it is converted to binary numbers, that is, to streams of bits (binary digits), zeros and ones.
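As a quick illustration, here is a minimal sketch in Python (the built-in functions are standard; the particular values are just examples) of what it looks like when numbers and text are reduced to bits:

```python
# Decimal 13 in binary: 8 + 4 + 0 + 1 -> 1101
print(bin(13))          # '0b1101'
print(int("1101", 2))   # 13 -- converting back from binary to decimal

# Text is stored as numbers too: each character has a numeric code,
# and that code is what is actually kept in memory as bits.
for ch in "Hi":
    print(ch, ord(ch), bin(ord(ch)))
# H 72 0b1001000
# i 105 0b1101001
```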
Hardware refers to the electrical or physical components of the computer. A modern computer's circuitry is mainly built of transistors. Transistors act like switches and control the flow of electrical states in a computer. At the fundamental level, transistors are used to make logic gates, and eventually these logic gates are used to make components like the central processing unit and the memory of the computer.
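To make the idea of composing switches into logic concrete, here is a small sketch in Python (the helper functions are invented for illustration, not any particular chip): every gate below is built from NAND alone, much as real circuits compose simple switch-like elements into more complex logic.

```python
def nand(a: int, b: int) -> int:
    # NAND is straightforward to build from transistor switches:
    # the output is 1 unless both inputs are 1.
    return 0 if (a and b) else 1

# Every other basic gate can be composed from NAND alone.
def not_(a):     return nand(a, a)
def and_(a, b):  return not_(nand(a, b))
def or_(a, b):   return nand(not_(a), not_(b))
def xor(a, b):   return and_(or_(a, b), nand(a, b))

# A half adder -- the first step toward the arithmetic circuits in a CPU.
def half_adder(a, b):
    return xor(a, b), and_(a, b)   # (sum bit, carry bit)

print(half_adder(1, 1))  # (0, 1): in binary, 1 + 1 = 10
```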
When computer hardware is manufactured, it is programmed in machine and assembly languages. A machine language is built into the circuitry of the computer; it consists of instructions written as ones and zeros. Assembly language uses words and symbols to give instructions to the computer, and instructions in assembly are translated to machine language by a program called an assembler.
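As a tiny illustration of what an assembler does, here is a sketch in Python with a made-up four-instruction machine (none of these opcodes belong to a real processor): translating mnemonics into machine code is, at heart, a table lookup plus operand encoding.

```python
# A made-up machine: each mnemonic maps to a 4-bit opcode pattern.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011", "HALT": "1111"}

def assemble(lines):
    """Translate assembly mnemonics into machine-language bit strings."""
    machine_code = []
    for line in lines:
        parts = line.split()
        mnemonic = parts[0]
        operand = parts[1] if len(parts) > 1 else "0"
        # opcode bits followed by the operand as a 4-bit value
        machine_code.append(OPCODES[mnemonic] + format(int(operand), "04b"))
    return machine_code

program = ["LOAD 5", "ADD 6", "STORE 7", "HALT"]
print(assemble(program))
# ['00010101', '00100110', '00110111', '11110000']
```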
The fourth layer in a computing system is the operating system layer. This layer uses a piece of software that manages system resources; this software is called the operating system. Examples of resources are the processor, memory, and I/O devices. The operating system, combined with other software that uses or manages resources, makes up what is called system software. Common operating systems in use are Windows, macOS, and Linux.
A user interacts with a computing system at the application layer, through application software. Application software is used for solving real-world problems: it can be used to design real-world objects, play games, solve problems in a specific field, and do many other kinds of tasks. Almost every area of computer technology uses specific application software to make use of computers.
At the communication layer,
we will understand how computers communicate with one another. To make communication possible, computers are connected to networks; the Internet, which is a network of networks, is the result of advancements in computer communication technology. The World Wide Web is a technology that uses the Internet to provide access to all kinds of information.
Abstraction is one of the most important concepts in computer science. While using a computer, you do not think about how it performs its various functions; you often do not care how the software or hardware works while you are using it. Abstraction in computer science is the process of hiding complex details and showing only the information necessary to accomplish a goal. For example, to drive a car we don't need to know how the engine works, or how the pedals or steering work.
Information hiding is closely related to abstraction. While abstraction is the external view of a system, information hiding refers to the internal structure of a system or program: it is a technique that specifies how one part of a program or code can be hidden from another. The aim of information hiding is to separate program parts in order to hide sensitive details.
Abstraction is used at all the layers of the computing system discussed previously. While working with one layer of the computing system, you don't have to know how the layers below or above it work. For example, while using the Internet to communicate, you have probably never stopped to think about how computers connect, send, and receive information. Understanding abstraction is key to understanding computer science.
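Here is a small sketch of both ideas in Python (the class and method names are invented for illustration): the public method is the abstraction the caller sees, while the leading-underscore members are the hidden internal details.

```python
class Car:
    """The driver's view: a simple interface that hides the machinery."""

    def __init__(self):
        self._rpm = 0            # internal detail, hidden by convention

    def _inject_fuel(self):      # information hiding: not part of the public view
        self._rpm += 500

    def accelerate(self):
        """Abstraction: the only thing the 'driver' needs to call."""
        self._inject_fuel()
        return f"engine now at {self._rpm} rpm"

driver = Car()
print(driver.accelerate())  # the caller never touches _rpm or _inject_fuel
```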
To understand computer science, it is necessary to learn something about the history of computers. In simple terms, computing refers to calculations performed using arithmetic. We will start the history of computers with a very famous calculating device of the past: the abacus. The abacus was used for recording numeric values and performing basic arithmetic. It is believed to have been first used by the Chinese, and it remained in common use until the early 20th century. An abacus had beads arranged in rows, with each row representing a specific place value of a digit.
While the abacus was still in common use, in the 17th century the French mathematician Blaise Pascal created a mechanical calculator that performed only two operations: whole-number addition and subtraction. Later, the German inventor Gottfried Wilhelm von Leibniz invented a mechanical calculator that performed whole-number addition, subtraction, multiplication, and division. In the 18th century, Joseph Jacquard
invented a loom for weaving cloth. The loom used a series of cards punched with holes as input; the holes in the cards selected specific colored threads for weaving a pattern into the cloth. Although it was not a computing device, it introduced the idea of punched cards as a means of providing input.
In the 19th century,
Charles Babbage designed the Difference Engine and left notes for the design of the Analytical Engine. The Analytical Engine was the first machine to introduce the concept of memory, so that intermediate values did not need to be re-entered. Making use of punched cards like those of Jacquard's loom, his machine could also take input. The famous programmer Ada Lovelace worked with Babbage on his machines.
At the end of the 19th century and the
beginning of the 20th century, many inventors appeared. William Burroughs built and sold a mechanical calculator. Dr. Herman Hollerith invented the first electromechanical tabulator, and he later founded the famous company popularly known today as IBM (International Business Machines). Until the beginning of the 20th century, however, there had been little theoretical development in computing devices.
Before World War II, Alan Turing
developed an abstract mathematical model called the Turing machine. A Turing machine uses a predefined set of rules to determine a result from input values. Although only a theoretical concept, it showed how computers could be used to solve complex problems.
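To make the "predefined set of rules" idea concrete, here is a minimal Turing machine simulator in Python (the particular machine is a toy chosen for this example: it flips every bit on the tape and halts at the first blank cell):

```python
# Rules: (state, symbol read) -> (new state, symbol to write, head movement)
RULES = {
    ("flip", "0"): ("flip", "1", +1),
    ("flip", "1"): ("flip", "0", +1),
    ("flip", " "): ("halt", " ", 0),   # blank cell: nothing left to do
}

def run(tape: str) -> str:
    cells, state, head = list(tape + " "), "flip", 0
    while state != "halt":
        state, cells[head], move = RULES[(state, cells[head])]
        head += move
    return "".join(cells).strip()

print(run("10110"))  # 01001 -- every bit inverted
```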
After Alan Turing's mathematical model, theoretical and practical developments began to happen at the same time in the field of computer science.
In 1937,
George Stibitz constructed a binary adder using relays at Bell Labs. A relay is an electrically operated switch whose output signal is controlled by its inputs. In the same year, Claude E. Shannon published a paper that discussed implementing symbolic logic using relays. Konrad Zuse, a German inventor, built the first mechanical binary programmable computer in 1938. In 1943, Tommy Flowers built a programmable electronic digital computer (the Colossus). In 1944, the Harvard Mark I was introduced. ENIAC, the Electronic Numerical Integrator and Computer, was introduced in 1946; it is often described as the first general-purpose electronic digital computer. John von Neumann served as a consultant in the building of ENIAC. In 1950, EDVAC was introduced. It was followed by the first commercial computer, called UNIVAC, which was famously used to predict election results. The mathematician John von Neumann also worked on these machines.
From this point on, the history of computer hardware is categorized into generations. This was the era of computers built from vacuum tubes and relays.
In the 1950s, the first commercial computers were introduced. This was the first generation of modern computers, which ran from 1951 to 1959. Computers in this generation used vacuum tubes to store information. The primary memory device was a magnetic drum that rotated under a read/write head. The input device was a card reader that read the holes punched in IBM cards, cards similar in concept to those used in Jacquard's loom. The output devices were punched cards and line printers. In the later years of this generation, magnetic tape was introduced; it stored data sequentially and was the first auxiliary (external) storage device for a computer. The word "peripherals" was first used in this generation for input/output and auxiliary storage devices.
The second generation of computers is counted from 1959 to 1965. The biggest invention of this generation was the transistor, which replaced vacuum tubes in computers. It was invented by John Bardeen, Walter H. Brattain, and William B. Shockley. Computer memory in this generation was made from magnetic cores: each magnetic core stored one bit, and cores were strung together to form memory cells. These cells made immediate access to memory possible, something that was not possible with drums.
In this generation, a new auxiliary storage device, the magnetic disk, was also introduced. It is faster than magnetic tape: with tape you have to go through every piece of information sequentially to reach a specific item, whereas on a magnetic disk each piece of data has its own address, so the read/write head can access information directly by that address.
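The difference is easy to see in code (a Python sketch; the list and dictionary here simply stand in for tape and disk):

```python
records = ["rec0", "rec1", "rec2", "rec3", "rec4"]

# Tape-style sequential access: scan from the start until we reach item n.
def read_tape(n):
    for i, rec in enumerate(records):
        if i == n:
            return rec       # reached only after passing records 0..n-1

# Disk-style direct access: every record is reachable by its address.
disk = {addr: rec for addr, rec in enumerate(records)}

print(read_tape(3))   # walks past rec0..rec2 first
print(disk[3])        # jumps straight to the record
```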
The third generation of computers runs from 1965 to 1971. In this generation, integrated circuit (IC) chips were used: the components of a computer, transistors included, are interconnected on a single silicon chip. Gordon Moore, a co-founder of Intel, noticed that the number of circuits that could be placed on an IC was doubling each year; his observation became known as Moore's Law.
In the third generation, the transistor became the number-one choice for making computer components. Logic gates built with transistors were used to build the memory and other circuits of the computer. Although memory built with transistors was very fast, it was volatile, so auxiliary storage devices were still needed. The terminal, an input/output device with a keyboard, was introduced in this generation.
The fourth generation of computers starts from 1971. This generation was characterized by large-scale integration of transistors on a chip: instead of individual transistors on a circuit board, whole microprocessor or microcomputer chips were mounted on the board. The phrase "personal computer" also entered the dictionary in this generation. New tech companies of this generation included Apple, Atari, Commodore, and Sun Microsystems, and popular computers were the IBM PC and Apple's Macintosh. Larger computers, known as workstations, to which multiple terminals were connected, were also introduced in this generation.
In this generation, the computer's internal instruction set was also revised: the RISC (reduced instruction set computer) architecture was introduced. Moore's Law was modified twice. In the third generation, Moore's Law stated that the number of transistors on a chip doubled each year; in the fourth generation, the observation was restated as "chip density keeps doubling every 18 months."
As the latest restatement, it has been noted that "computers will either double in power at the same price or halve in cost for the same power every 18 months" (Moore's Law).
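A quick worked example of what "doubling every 18 months" means (just the doubling rule applied to a starting point; the 2,300-transistor figure is the Intel 4004 from 1971):

```python
# Doubling every 18 months: N(t) = N0 * 2 ** (months / 18)
n0 = 2_300                      # transistors on the Intel 4004 (1971)
for years in (0, 3, 6, 9):
    months = years * 12
    print(years, "years:", int(n0 * 2 ** (months / 18)), "transistors")
# 0 years: 2300 | 3 years: 9200 | 6 years: 36800 | 9 years: 147200
```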
Since computers are still produced using integrated circuit chips, there is no fundamental difference between the fourth and fifth generations. However, the number of transistors on a single chip has increased tremendously: from a few thousand transistors, modern chips have grown to millions (and now billions) of transistors. This is known as very-large-scale integration (VLSI), or ultra-large-scale integration. Besides ultra-large-scale integration, modern computers come with multiple cores in a single processor chip, which makes parallel processing possible: more than one core can work on a single task at the same time.
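As a small illustration of the idea (a Python sketch using the standard multiprocessing module; the prime-counting task is invented for the example), several cores each take one slice of a single task:

```python
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) -- one slice of the bigger task."""
    lo, hi = bounds
    return sum(n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
               for n in range(lo, hi))

if __name__ == "__main__":
    # Split one task (count primes below 40000) into four slices, one per core.
    slices = [(i, i + 10_000) for i in range(0, 40_000, 10_000)]
    with Pool(4) as pool:
        print(sum(pool.map(count_primes, slices)))  # same answer, computed in parallel
```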
Artificial intelligence is also a very hot topic of research in this generation.
Since the first commercial computer, software has evolved even more than hardware, and understanding the evolution of software is necessary to understand how modern software works. The first generation of software is considered to coincide with the first generation of computer hardware. Instructions were written in binary numbers, the machine language of the computer; these languages are built into the hardware when it is manufactured.
Since writing programs in machine language was hard, assembly languages were developed to make writing programs easier. These languages used mnemonics to write instructions, and the instructions were translated into machine language by a program called a translator. Assembly languages acted as a buffer between the programmer and the machine hardware. In this generation, system programmer and application programmer were the same job, and the only users of the computer were the people who could program it.
The second generation of software came with more human-like computer languages, known as high-level languages. Popular languages developed in this generation were Fortran (Formula Translation), developed for mathematical purposes; COBOL (Common Business-Oriented Language), developed for business purposes; and Lisp (List Processing), created for artificial intelligence purposes. In this generation, system programmers and application programmers had different jobs.
This picture represents the concept of abstraction in computer science with respect to programming: programmers working with high-level languages don't have to care about how the assembly language or machine language of a device works. In the first and second generations, computer resources were of no concern to system programmers, because instructions for each piece of hardware were written separately.
In the third generation, to make computers more efficient, computer resources were put under the control of the computer itself. To do that, a piece of software called the operating system was developed to manage computer resources. Utility programs, linkers, loaders, translators, and the operating system were combined under the name system software.
Utility programs are programs designed for the general support of a computer's processes. A loader's job is to load programs into memory, and a linker links the pieces of large programs together. An assembler translates an assembly language program into machine language, and a compiler translates high-level language code into code that the machine can understand.
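To see how these roles fit together, here is a toy pipeline in Python (every function here is invented for illustration; real compilers, assemblers, and loaders are far more involved): a "compiler" turns a tiny expression into mnemonics, an "assembler" turns those into bit strings, and a "loader" places the result into a memory array.

```python
def compile_expr(expr):
    """'Compile' a sum like '5+6' into toy assembly mnemonics."""
    first, second = expr.split("+")
    return [f"LOAD {first}", f"ADD {second}", "HALT"]

def assemble(lines):
    """Same table-lookup idea as the earlier assembler sketch."""
    opcodes = {"LOAD": "0001", "ADD": "0010", "HALT": "1111"}
    return [opcodes[l.split()[0]] + format(int((l.split() + ["0"])[1]), "04b")
            for l in lines]

def load(machine_code, memory, base):
    """Loader: copy the program into memory starting at a base address."""
    memory[base:base + len(machine_code)] = machine_code
    return memory

memory = ["00000000"] * 8
print(load(assemble(compile_expr("5+6")), memory, base=2))
```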
The idea of time-sharing systems emerged in this generation because of the operating system: many users at terminals could use the same system, often a large machine such as a workstation. General-purpose application programs, such as the Statistical Package for the Social Sciences (SPSS), were also written in this generation. In the third generation, system programmer, application programmer, and user were three different entities in the world of computers. This picture represents how system software provided abstraction to the user and the application programmer.
In the fourth generation of computer software, the structured programming technique was introduced. Pascal, Modula-2, BASIC, and C were popular structured programming languages. C++ was also introduced in the 1980s; it was a structured programming language that added object-oriented features on top of C. Popular operating systems in this generation were UNIX, PC DOS for IBM computers, and MS-DOS for many other computers. The mouse was introduced for the first time in this generation, and Apple introduced its point-and-click graphical user interface, which used a mouse for input.
Popular user application software in this generation included spreadsheet programs like Lotus 1-2-3, the first word processing programs like WordPerfect, and small database programs like dBase IV. Today such programs are bundled into a suite called an office suite; MS Office, LibreOffice, and OpenOffice are some examples.
In the fifth generation, object-oriented programming, the World Wide Web, and Microsoft software technologies emerged. Popular programming languages in this generation were Java and C++. Tim Berners-Lee created the Hypertext Markup Language (HTML) in 1991, which became the foundation of the modern World Wide Web. Mosaic, whose successor was known as Netscape, was the first popular browser of this era; the modern Firefox browser descends from that Netscape line. In the fifth generation, self-published writers emerged thanks to blog-enabled websites like Blogger and WordPress, and dynamic content-creation websites like Wikipedia, YouTube, and many others allow users to create and share content related to education, entertainment, and hobbies. Social networking sites like Facebook and Twitter have been used by billions of people to share text and graphics using computers.
From the first generation onward, the role of the user has kept changing. The first-generation user was a programmer. In the second generation, system programmers wrote tools for application programmers. In the third generation, application programmers used tools written by system programmers to write programs for non-programmers, or users. In the fourth generation, the computer became popular among all types of users, and in the fifth generation, besides using computers for different purposes, users started to create content for other users.
That's all for today's lecture. In this lecture we introduced computer science and learned about the history of computer devices and of computer software. Thanks for watching.
