Generations of Computers
Introduction
The development of electronic computers has helped to make the concept of the computer a concrete reality.
The computer has taken a big leap forward with each technological breakthrough in its development. The functions computers perform and the speed at which they run have changed continuously, and great variations in size and cost, together with major breakthroughs in hardware and software, have produced ever more advanced machines. Altogether there are six major stages in this continuous development of the computer. These stages are called generations of computers.
1. First Generation Computers (1937-1953)
The computers developed between 1937 and 1953 are known as first generation computers. This generation is characterized by the vacuum tube, the electronic component from which its hardware was built. Computers of this generation were very large. The important features of first generation computers are:
- Vacuum tubes were used for the electronics, and magnetic drums were used as the primary storage medium.
- Storage capacity was limited (e.g. 1 kilobyte to 4 kilobytes).
- The operating speed was slow (on the order of milliseconds).
- Large in size.
- These computers used low-level (machine-level) language.
- These computers were used for
scientific calculations and record keeping.
- They produced a great deal of heat during operation.
- They consumed a large amount of electricity.
Three machines have been promoted at various times as the
first electronic computers. These machines used electronic switches, in the
form of vacuum tubes, instead of electromechanical relays. In principle the
electronic switches would be more reliable, since they would have no moving
parts that would wear out, but the technology was still new at that time and
the tubes were comparable to relays in reliability. Electronic components had
one major benefit, however: they could "open" and "close"
about 1,000 times faster than mechanical switches.
The earliest attempt to build an electronic computer was made by J.V. Atanasoff, a professor of physics and mathematics at Iowa State, in 1937.
Atanasoff set out to build a machine that would help his graduate students
solve systems of partial differential equations. By 1941 he and graduate
student Clifford Berry had succeeded in building a machine that could solve 29
simultaneous equations with 29 unknowns. However, the machine was not
programmable, and was more of an electronic calculator.
The first general-purpose programmable electronic computer was the Electronic Numerical Integrator and Computer (ENIAC), built by J. Presper Eckert and John W. Mauchly at the University of Pennsylvania. Another new concept was developed in EDVAC (Electronic Discrete Variable Automatic Computer). The main contribution of EDVAC was the notion of the stored program. EDVAC was able to run orders of magnitude faster than ENIAC. By storing instructions in the same medium as data, designers could concentrate on improving the internal structure of the machine without worrying about matching it to the speed of an external control.
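To make the stored-program idea concrete, here is a minimal sketch in C of a toy machine whose instructions and data live in the same memory array and are worked through by a simple fetch-decode-execute loop. The three-instruction encoding (opcode * 100 + address) is invented purely for illustration and does not correspond to EDVAC's actual instruction set.

    #include <stdio.h>

    /* Toy stored-program machine (hypothetical encoding: opcode*100 + address).
       Opcode 1 = LOAD addr, 2 = ADD addr, 3 = PRINT and halt.                   */
    int main(void)
    {
        int memory[16] = {
            110, 211, 300,          /* the program: LOAD 10; ADD 11; PRINT        */
            0, 0, 0, 0, 0, 0, 0,
            25, 17                  /* the data, in the same memory (cells 10, 11) */
        };
        int pc = 0, acc = 0;

        for (;;) {                  /* fetch-decode-execute cycle */
            int instr  = memory[pc++];
            int opcode = instr / 100, addr = instr % 100;
            if (opcode == 1)      acc = memory[addr];
            else if (opcode == 2) acc += memory[addr];
            else                  { printf("%d\n", acc); break; }
        }
        return 0;
    }

Because the program is just numbers in memory, it can be loaded, changed or replaced like any other data, which is exactly the freedom the stored-program design gave machine designers.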
Software technology during this period was very primitive. The first programs were written out in machine code, i.e. programmers directly wrote down the numbers that corresponded to the instructions they wanted to store in memory. By the 1950s programmers were using a symbolic notation, known as assembly language, and then translating the symbolic notation into machine code by hand. Later, programs known as assemblers performed the translation task.
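The sketch below, in C, shows roughly what that translation task involves: a table maps symbolic mnemonics onto numeric opcodes, using the same invented opcode*100 + address encoding as the toy machine above. The mnemonics LOAD, ADD and STORE and their opcode values are assumptions made for the example, not a real instruction set.

    #include <stdio.h>
    #include <string.h>

    /* A hypothetical three-instruction set; the opcode values are invented. */
    struct op { const char *mnemonic; int opcode; };

    static const struct op table[] = {
        { "LOAD",  1 },    /* load a memory word into the accumulator */
        { "ADD",   2 },    /* add a memory word to the accumulator    */
        { "STORE", 3 },    /* store the accumulator back into memory  */
    };

    /* Translate one symbolic instruction, e.g. "ADD 11", into a machine word. */
    static int assemble(const char *mnemonic, int address)
    {
        for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
            if (strcmp(table[i].mnemonic, mnemonic) == 0)
                return table[i].opcode * 100 + address;   /* e.g. ADD 11 -> 211 */
        return -1;   /* unknown mnemonic */
    }

    int main(void)
    {
        /* "LOAD 10; ADD 11; STORE 12" assembles to 110, 211, 312. */
        printf("%d %d %d\n",
               assemble("LOAD", 10), assemble("ADD", 11), assemble("STORE", 12));
        return 0;
    }

Early programmers performed this lookup by hand for every line; an assembler simply automates it.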
2. Second Generation (1954-1962)
In the late 1940s, scientists invented another electronic component, the transistor, which was used in place of the vacuum tubes of first generation computers. Computers built with transistors are classified as second generation computers. A single transistor could do the job of a vacuum tube while being smaller, cheaper, more reliable and far less power-hungry.
The second generation saw several important developments at
all levels of computer system design, from the technology used to build the
basic circuits to the programming languages used to write scientific
applications. Important innovations in computer architecture include index registers for controlling loops and floating-point units for calculations based on real numbers.
During this generation many high-level programming languages were introduced, including FORTRAN (FORmula TRANslation) in 1956, ALGOL (ALGOrithmic Language) in 1958, and COBOL (COmmon Business Oriented Language) in 1959. Important commercial machines of this era include the IBM 704 and its successors, the 709 and 7094. The latter introduced I/O processors for better throughput between I/O devices and main memory.
The second generation also saw the first two supercomputers
designed specifically for numeric processing in scientific applications. The term
"supercomputer" is generally reserved for a machine that is an order
of magnitude more powerful than other machines of its era. Two machines of the
1950s deserve this title. The Livermore Atomic Research Computer (LARC) and the
IBM 7030 (aka Stretch) were early examples of machines that overlapped memory
operations with processor operations and had a primitive form of parallel processing.
3. Third Generation (1963-1972)
The third generation brought huge gains in computational
power. Innovations in this era include the use of integrated circuits, or ICs
(semiconductor devices with several transistors built into one physical
component), semiconductor memories starting to be used instead of magnetic
cores, microprogramming as a technique for efficiently designing complex processors,
the coming of age of pipelining and other forms of parallel processing, and the introduction of operating systems and time-sharing.
The first ICs were based on small-scale integration (SSI) circuits, which had around 10 devices per circuit (or "chip"), and evolved to the use of medium-scale integrated (MSI) circuits, which had up to 100 devices per chip. Multilayered printed circuits were developed, and core memory was replaced by faster, solid-state memories.
Computers of this generation could perform parallel processing, which made the whole system faster: in parallel processing, more than one task can be carried out at the same time. Other important features of this generation were multiprocessing and multiprogramming. Multiprocessing means that two or more processors work together so that several processes execute at the same time, whereas multiprogramming means that several programs are kept in memory and share a single processor, which switches between them.
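As a rough modern illustration of more than one task making progress at the same time, the C sketch below starts two POSIX threads that each compute an independent partial sum. The work the threads do and the numbers involved are invented for the example; on a machine with several processors the two threads can run truly in parallel, while on a single processor they are interleaved, in the spirit of multiprogramming.

    #include <pthread.h>
    #include <stdio.h>

    /* Each thread sums its own range of integers, independently of the other. */
    static void *partial_sum(void *arg)
    {
        long start = *(long *)arg, sum = 0;
        for (long i = start; i < start + 10000; i++)
            sum += i;
        printf("thread starting at %ld computed %ld\n", start, sum);
        return NULL;
    }

    int main(void)
    {
        pthread_t t1, t2;
        long a = 0, b = 10000;

        pthread_create(&t1, NULL, partial_sum, &a);   /* two tasks ...          */
        pthread_create(&t2, NULL, partial_sum, &b);   /* ... at the same time   */
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        return 0;
    }

On POSIX systems this compiles with the -pthread option.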
In this third generation, Cambridge and the University of London cooperated in the development of CPL (Combined Programming Language, 1963). CPL was, according to its authors, an attempt to capture only the important features of the complicated and sophisticated ALGOL. However, like ALGOL, CPL was large, with many features that were hard to learn. In an attempt at further simplification, Martin Richards of Cambridge developed a subset of CPL called BCPL (Basic Combined Programming Language) in 1967. In 1970 Ken Thompson of Bell Labs developed yet another simplification of CPL, called simply B, in connection with an early implementation of the UNIX operating system.
4. Fourth Generation (1972-1984)
The next generation of computer systems saw the use of large-scale integration (LSI - 1,000 devices per chip) and very large-scale integration (VLSI - 100,000 devices per chip) in the construction of computing elements. At this scale an entire processor fits onto a single chip, and for simple systems the entire computer (processor, main memory and I/O controllers) can fit on one chip. Gate delays dropped to about 1 ns (nanosecond) per gate.
The microprocessor was developed during this generation. A microprocessor is a single chip on which the ALU (Arithmetic Logic Unit), the control unit and related small memories such as registers are all integrated.
Semiconductor memories replaced core memories as the main memory in most systems; until this time the use of semiconductor memory in most systems had been limited to registers and cache. During this period high-speed vector processors, such as the CRAY-1, CRAY X-MP and CYBER 205, dominated the high-performance computing scene. Computers with large main memory, such as the CRAY-2, began to emerge. A variety of parallel architectures began to appear; however, during this period parallel computing efforts were of a mostly experimental nature and most computational science was carried out on vector processors. Microcomputers and workstations were introduced and saw wide use as alternatives to time-shared mainframe computers.
Developments in software include very high-level languages such as FP (Functional Programming) and Prolog (Programming in Logic). These languages tend to use a declarative programming style, as opposed to the imperative style of Pascal, C, FORTRAN, etc. Two important events of the early 1970s were the development of the C programming language and the UNIX operating system, both at Bell Labs. In 1972, Dennis Ritchie, seeking to meet the design goals of CPL and generalize Thompson's B, developed the C language. Thompson and Ritchie then used C to write a version of UNIX for the DEC PDP-11.
5. Fifth Generation (1984-1990)
The development of the next generation of computer systems is characterized mainly by the acceptance of parallel processing. Until this time parallelism was limited to pipelining and vector processing, or at most to a few processors sharing jobs. The fifth generation saw the introduction of machines with hundreds of processors that could all be working on different parts of a single program. The scale of integration in semiconductors continued at an incredible pace - by 1990 it was possible to build chips with a million components, and semiconductor memories became standard on all computers.
Other new developments were the widespread use of computer
networks and the increasing use of single-user workstations. Prior to 1985 large
scale parallel processing was viewed as a research goal, but two systems
introduced around this time are typical of the first commercial products to be
based on parallel processing.
The computer network is another new concept used in this generation. A computer network is a connection of computers to one another in order to exchange information and share resources.
Intel, for example, connected each processor to its own memory and used a network interface to connect the processors. This distributed memory architecture meant that memory was no longer a bottleneck, and large systems (using more processors) could be built. Toward the end of this period another type of parallel processor was introduced to the market. In this style of machine, known as data-parallel or SIMD, there are several thousand very simple processors. All processors work under the direction of a single control unit; i.e. if the control unit says "add a to b", then all processors find their local copy of a and add it to their local copy of b.
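A minimal sketch of that data-parallel idea in ordinary C: each array index below stands for one processor's local copies of a and b, and the single loop plays the role of the control unit broadcasting the one instruction "add a to b" that every processor applies to its own data. The array size and values are invented for the illustration; a real SIMD machine would perform all the additions simultaneously rather than one after another.

    #include <stdio.h>

    #define N 8   /* stands in for the several thousand simple processors */

    int main(void)
    {
        /* Element i represents processor i's local copies of a and b. */
        int a[N] = { 1,  2,  3,  4,  5,  6,  7,  8};
        int b[N] = {10, 20, 30, 40, 50, 60, 70, 80};

        /* The control unit issues one instruction, "add a to b";
           conceptually every processor executes it on its own data. */
        for (int i = 0; i < N; i++)
            b[i] += a[i];

        for (int i = 0; i < N; i++)
            printf("%d ", b[i]);
        printf("\n");
        return 0;
    }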
Scientific computing in this period was still dominated by vector processing. Most manufacturers of vector processors introduced parallel models, but there were very few (two to eight) processors in these parallel machines. In the area of computer networking, both wide area network (WAN) and local area network (LAN) technology developed at a rapid pace, stimulating a transition from the traditional mainframe computing environment toward a distributed computing environment in which each user has their own workstation for relatively simple tasks (editing and compiling programs, reading mail) but shares large, expensive resources such as file servers and supercomputers. RISC (Reduced Instruction Set Computer) technology (a style of internal organization of the CPU) and plummeting costs for RAM brought tremendous gains in the computational power of relatively low-cost workstations and servers. This period also saw a marked increase in both the quality and quantity of scientific visualization.
6. Sixth Generation (1990-)
Transitions between generations in computer technology are
hard to define, especially as they are taking place. Some changes, such as the
switch from vacuum tubes to transistors, are immediately apparent as
fundamental changes. But others are clear only in retrospect. Many of the
developments in computer systems since 1990 reflect gradual improvements over
established systems, and thus it is hard to claim they represent a transition
to a new "generation", but other developments will prove to be
significant changes.
This generation is beginning with many gains in parallel
computing, both in the hardware area and in improved understanding of how to
develop algorithms to exploit diverse, massively parallel architectures.
Parallel systems now compete with vector processors in terms of total computing power, and most expect parallel systems to dominate the future.
Workstation technology has continued to improve, with
processor designs now using a combination of RISC, pipelining, and parallel
processors. As a result it is now possible to purchase a desktop workstation
for about $30,000 that has the same overall computing power (100 megaflops) as
fourth generation supercomputers. This development has sparked an interest in
heterogeneous computing: a program started on one workstation can find idle
workstations elsewhere in the local network to run parallel subtasks.
The expected features of these computers include natural language processing, artificial intelligence, problem-solving techniques, pattern recognition and speech recognition.