Tuesday, 8 July 2014

History of Computer, Generations of Computer



The word "computer" is derived from "compute." The roots of the modern science that we call computer science can be traced back to an age when man still dwelled in caves or in the forest, living in groups for protection and survival against the harsher elements of the Earth.

It was the tribal shaman who decided when to hold both the secret and public religious ceremonies, and who interceded with the spirits on behalf of the tribe. In order to hold the ceremonies correctly, to ensure a good harvest in the fall and fertility in the spring, the shamans needed to be able to count the days and track the seasons. From the shamanistic tradition, man developed the first primitive counting mechanisms: counting notches on sticks or marks on walls.

As computing became more and more complicated, the first computing device came into being: the abacus.


The first actual calculating mechanism known to us is the abacus, which is thought to have been invented by the Babylonians sometime between 1,000 BC and 500 BC, although some pundits are of the opinion that it was actually invented by the Chinese.

The word abacus comes to us by way of Latin as a mutation of the Greek word abax. In turn, the Greeks may have adopted the Phoenician word abak, meaning "sand", although some authorities lean toward the Hebrew word abhaq, meaning "dust."

Irrespective of the source, the original concept referred to a flat stone covered with sand (or dust) into which numeric symbols were drawn. The first abacus was almost certainly based on such a stone, with pebbles being placed on lines drawn in the sand. Over time the stone was replaced by a wooden frame supporting thin sticks, braided hair, or leather thongs, onto which clay beads or pebbles with holes were threaded.
A variety of different types of abacus were developed, but the most popular became those based on the bi-quinary system, which uses a combination of two bases (base-2 and base-5) to represent decimal numbers. Although the abacus does not qualify as a mechanical calculator, it certainly stands proud as one of the first mechanical aids to calculation.

John Napier, a Scottish mathematician, developed logarithms, which are very useful in mathematics and computer technology. Napier also designed the logarithm table, which brought revolutionary change to mathematics and computing.
Napier's invention led directly to the slide rule, first built in England in 1632 and still in use in the 1960s by the NASA engineers of the Mercury, Gemini, and Apollo programs that landed men on the moon. The slide rule was used for sine, cosine, tangent, and other trigonometric and arithmetic calculations.
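The principle behind Napier's logarithms and the slide rule can be shown in a few lines (a modern illustrative sketch, not period notation; the function name is ours): because log(ab) = log(a) + log(b), multiplication reduces to adding lengths on logarithmic scales.

```python
import math

# A slide rule multiplies by adding lengths proportional to logarithms:
# log10(a * b) = log10(a) + log10(b), so sliding one log scale along
# another performs the addition, and reading the result undoes the log.
def slide_rule_multiply(a, b):
    """Multiply a and b the way a slide rule does: add logs, then exponentiate."""
    return 10 ** (math.log10(a) + math.log10(b))

print(slide_rule_multiply(6, 7))  # approximately 42
```

The same trick turns division into subtraction of lengths, which is why one well-made pair of scales sufficed for decades of engineering arithmetic.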

In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. Pascal built 50 of these gear-driven one-function calculators (they could only add) but couldn't sell many because of their exorbitant cost and because they weren't very accurate (at that time it was not possible to fabricate gears with the required precision).
Until the present age, when car dashboards went digital, the odometer portion of a car's speedometer used the very same mechanism as the Pascaline, incrementing the next wheel after each full revolution of the prior wheel. Pascal was a child prodigy: at the age of 12, he was discovered doing his own version of Euclid's thirty-second proposition on the kitchen floor. Pascal went on to invent probability theory, the hydraulic press, and the syringe.
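The Pascaline/odometer carry is simple to model (a hypothetical sketch; the function name is ours): each wheel holds a digit from 0 to 9, and a full revolution of one wheel advances the next wheel by one.

```python
# Hypothetical model of the Pascaline's carry mechanism: a row of digit
# wheels, least-significant first. Turning a wheel past 9 resets it to 0
# and bumps the next wheel, just like a mechanical odometer.
def increment(wheels):
    """Advance the least-significant wheel by one, propagating carries."""
    i = 0
    while i < len(wheels):
        wheels[i] += 1
        if wheels[i] < 10:
            break
        wheels[i] = 0  # full revolution: carry into the next wheel
        i += 1
    return wheels

w = [9, 9, 0]            # wheels read "099"
print(increment(w))      # [0, 0, 1] -> wheels now read "100"
```

Note that when every wheel is at 9 the machine simply rolls over to all zeros, exactly as an odometer does.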


The great polymath Gottfried Leibniz was one of the first men to dream of a logical (thinking) device. Leibniz tried to combine the principles of arithmetic with the principles of logic, and imagined the computer as something more than a calculator: a logical or thinking machine.
He also discovered that computing processes can be carried out much more easily with binary number coding. He even described a calculating machine that worked via the binary system: a machine without wheels or cylinders, using only balls, holes, sticks, and channels for the transport of the balls.
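Leibniz's binary insight is easy to demonstrate with a short sketch (a modern illustration; the function names are ours): numbers coded in base 2 can be added with nothing more than a bit-by-bit rule and a carry, the kind of simple mechanical action his ball-and-channel machine envisioned.

```python
# Binary coding and addition, the arithmetic Leibniz found so simple:
# only two symbols per position, and a single carry rule.
def to_binary(n):
    """Code a non-negative integer in base 2 as a string of 0s and 1s."""
    return bin(n)[2:]

def binary_add(a, b):
    """Add two binary strings bit by bit, propagating a carry."""
    n = max(len(a), len(b))
    a, b = a.zfill(n), b.zfill(n)   # pad to equal length with leading zeros
    result, carry = [], 0
    for x, y in zip(reversed(a), reversed(b)):  # least significant bit first
        total = int(x) + int(y) + carry
        result.append(str(total % 2))
        carry = total // 2
    if carry:
        result.append("1")
    return "".join(reversed(result))

print(to_binary(5), to_binary(3), binary_add("101", "11"))  # 101 11 1000
```

Each position needs only the rules 0+0=0, 0+1=1, and 1+1=0 carry 1, which is why binary suits mechanical (and later electronic) implementation so well.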


Joseph Marie Jacquard (1752-1834) was a French silk weaver and inventor who improved on the original punched-card design of Jacques de Vaucanson's loom of 1745 to invent the Jacquard loom mechanism in 1804-1805. Jacquard's loom mechanism is controlled by recorded patterns of holes in a string of cards, and allows what is now known as Jacquard weaving of intricate patterns.


Charles Xavier Thomas de Colmar invented the first calculating machine to be produced in large numbers. This invention came about in France in 1820 as part of a national competition and the machine was called the Arithmometer.


The Arithmometer was essentially an early and large version of a pocket calculator (occupying the best part of a desk), and by 1845 there was a large, commercially successful industry involved in the manufacture of these machines.


The first glimmer of a "thinking machine" came in the 1830s, when British mathematician Charles Babbage envisioned what he called the analytical engine. Charles Babbage is considered the "Father of Computing." Babbage was a highly regarded professor of mathematics at Cambridge University when he resigned his position to devote all of his energies to his revolutionary idea.

In Babbage's time, the complex mathematical tables used by ships' captains to navigate the seas, as well as many other intricate computations, had to be calculated by teams of mathematicians who were called computers.
No matter how painstaking these human computers were, their tables were often full of errors. Babbage wanted to create a machine that could automatically calculate a mathematical chart or table in much less time and with more accuracy.
 His mechanical computer, designed with cogs and gears and powered by steam, was capable of performing multiple tasks by simple reprogramming—or changing the instructions given to the computer. 

LADY AUGUSTA ADA (1816-1852):
Lady Augusta Ada is mainly known for having written a description of Charles Babbage's early mechanical general-purpose computer, the analytical engine. The Ada programming language, developed for the US government, was named in her honor. The standard was originally known as Ada 83, but that version is now obsolete, having been overhauled and reborn as Ada 95, which is now the preferred standard and implementation of the Ada programming language.

Herman Hollerith developed the punched card system to store data in 1890. The punched card system was an important step in the development of the computer. His idea was quite different from the principles already established by Babbage or Colmar: he was inspired by the way a conductor punched tickets on the train. His tabulating machine was so successful that he started his own business to sell it; the company he founded later became International Business Machines (IBM). However, the original cards could not be used for complicated calculations.

Atanasoff Berry Computer is the name given, long after the fact, to an experimental machine for solving systems of simultaneous linear equations, developed in 1938-42 at Iowa State University by Dr. John Vincent Atanasoff and Clifford E. Berry. It is sometimes referred to by its initials, ABC.
The Atanasoff-Berry Computer, constructed in the basement of the Physics building at Iowa State University, took over two years to complete due to lack of funds. The prototype was first demonstrated in November of 1939. The computer weighed more than seven hundred pounds (320 kg). It contained approximately 1 mile (1.6 km) of wire.

English mathematician George Boole set up a system called Boolean algebra, wherein logical problems are solved like algebraic problems. Boole's theories went on to form the bedrock of computer science.

The creation of an algebra of symbolic logic was the work of another mathematical prodigy and British individualist. As Bertrand Russell remarked seventy years later, Boole invented pure mathematics. The design of circuits is arranged by logical statements, and these statements return zero (0) or one (1); this is the binary language of computers.
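Boole's system can be sketched in a few lines of code (an illustrative sketch, not from the original text; the function names are ours): treating truth values as the numbers 0 and 1, AND behaves like multiplication, OR like capped addition, and NOT like subtraction from 1.

```python
# Boolean algebra over {0, 1}: logic expressed as ordinary arithmetic.
def AND(x, y): return x * y           # 1 only when both inputs are 1
def OR(x, y):  return x + y - x * y   # 1 when at least one input is 1
def NOT(x):    return 1 - x           # swaps 0 and 1

# De Morgan's law, a classic identity of Boole's algebra:
# NOT(x AND y) equals NOT(x) OR NOT(y) for every 0/1 input.
for x in (0, 1):
    for y in (0, 1):
        assert NOT(AND(x, y)) == OR(NOT(x), NOT(y))
print("De Morgan's law holds for all 0/1 inputs")
```

This is precisely the sense in which circuit design is "arranged by logical statements": every gate in a digital circuit computes one of these 0/1 functions.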

MARK-I, ASCC (1944):
The Harvard Mark I, designed primarily by Prof. Howard Aiken, launched today's computer industry. The Mark I was the world's first fully automatic computer and the first machine to fulfill Babbage's dream.
A programmable, electromechanical calculator built by IBM and installed at Harvard in 1944, it strung 78 adding machines together to perform three calculations per second. It is also known as the ASCC (Automatic Sequence Controlled Calculator). It was 51 feet long, weighed five tons, and used paper tape for input and typewriters for output. Made of 765,000 parts, it sounded like a thousand knitting needles. The Mark I worked in decimal arithmetic, not binary, but it could run for hours without intervention.


ENIAC (1943-1946):
ENIAC stands for Electronic Numerical Integrator and Computer. It was the first operational electronic digital computer, developed for the U.S. Army by J. Presper Eckert and John Mauchly at the University of Pennsylvania in Philadelphia. Started in 1943, it took 200,000 man-hours and nearly half a million dollars to complete two years later.
Programmed by plugging in cords and setting thousands of switches, the decimal-based machine used 18,000 vacuum tubes, weighed 30 tons, and took up 1,800 square feet. It cost a fortune in electricity to run; however, at 5,000 additions per second, it was faster than anything else. Initially targeted at trajectory calculations, by the time it was ready to go World War II had ended. Soon after, it was moved to the Army's Aberdeen Proving Ground in Maryland, where it was put to good use computing thermonuclear reactions in hydrogen bombs and numerous other problems until it was dismantled in 1955.


EDVAC (1946-1952):
In 1944, while working as a research associate at the Moore School, Dr. John von Neumann worked on the EDVAC (Electronic Discrete Variable Automatic Computer), greatly advancing the functions of its predecessor. Completed in 1952, EDVAC had an internal memory for storing programs, used only 3,600 vacuum tubes, and took up a mere 490 square feet (45 sq m).
He undertook a study of computation that demonstrated that a computer could have a simple, fixed structure, yet be able to execute any kind of computation given properly programmed control without the need for hardware modification.

Von Neumann contributed a new understanding of how practical fast computers should be organized and built; these ideas, often referred to as the stored-program technique, became fundamental for future generations of high-speed digital computers and were universally adopted.


EDSAC (1946-1952):
EDSAC stands for Electronic Delay Storage Automatic Calculator; it was an early British computer. The machine, inspired by John von Neumann's seminal EDVAC report, was constructed by Professor Sir Maurice Wilkes and his team at the University of Cambridge Mathematical Laboratory in England.
EDSAC was the world's first practical stored-program electronic computer, although not the first stored-program computer (that honor goes to the Small-Scale Experimental Machine).
The project was supported by J. Lyons & Co. Ltd., a British firm, who were rewarded with the first commercially applied computer, LEO I, based on the EDSAC design. EDSAC ran its first programs on May 6, 1949, calculating a table of squares and a list of prime numbers.

UNIVAC-I (1951):
First-generation computers were characterized by the ENIAC's most prominent feature: vacuum tubes. Through 1950, several other computers used these tubes, each providing significant advances in computer development, including binary arithmetic, random access, and the concept of stored programs.
In 1951 the U.S. Bureau of the Census installed the first commercial computer, the Universal Automatic Computer (UNIVAC I). UNIVAC I was developed by Mauchly and Eckert for the Remington-Rand Corporation.
The first IBM product sold on the market was the IBM 701, in 1953. The IBM 650 was introduced the following year and proved very profitable for IBM. To stay ahead of its competitors, the IBM 650 was designed as an upgrade to the punched-card machines already in use: it processed data in a way similar to the traditional punched-card machines.


The history of computer development is often discussed in terms of the different generations of computing devices. A generation refers to the state of improvement in the product development process, and the term is also used for each major advancement in computer technology. With each new generation, the circuitry has become smaller and more advanced than in the generation before it.
As a result of this miniaturization, the speed, power, and memory of computers have proportionally increased. New discoveries are constantly being made that affect the way we live, work, and play.
Each generation of computers is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices.
Read about each generation and the developments that led to the devices we use today. The generations, divided into five categories, can be described as:

Early Period (1000 BC - 1940): as described in the previous chapter
First Generation (1942 - 1955): Vacuum Tubes
Second Generation (1955 - 1964): Transistors
Third Generation (1964 - 1975): Integrated Circuits (ICs)
Fourth Generation (since 1975): Microprocessor / Large Scale Integration
Fifth Generation (since 1980): Artificial Intelligence


THE EARLY DAYS (1,000 B.C. TO 1940)

Computers are named so because they make mathematical computations at fast speeds. As a result, the history of computing goes back at least 3,000 years, to when ancient civilizations were making great strides in arithmetic and mathematics. The Greeks, Egyptians, Babylonians, Indians, Chinese, and Persians were all interested in logic and numerical computation. The Greeks focused on geometry and rationality, the Egyptians on simple addition and subtraction, the Babylonians on multiplication and division, the Indians on the base-10 decimal numbering system and the concept of zero, the Chinese on trigonometry, and the Persians on algorithmic problem solving.
These developments carried over into the more modern centuries, fueling advancements in areas like astronomy, chemistry, and medicine.
(All other history, from the abacus to UNIVAC-I, is described in the previous chapter.)

FIRST GENERATION (1942 - 1955)

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. First generation computers relied on machine language to perform operations, and they could only solve one problem at a time.
The Mark-I, EDSAC, EDVAC, UNIVAC-I and ENIAC computers are examples of first-generation computing devices. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
Because vacuum tubes were used to calculate and store information, these computers were also very hard to maintain. First-generation computers also used punched cards to store symbolic programming languages. Most people were only indirectly affected by this first generation of computing machines and knew little of their existence.

Advantages:
1.     After a long history of computation, first-generation computers could process tasks in milliseconds.
2.     The hardware was operated and programmed with machine languages (languages close to the machine's understanding).
3.     Vacuum tube technology was very important, as it opened the gates to the digital world of communication.

Disadvantages:
1.     The machines were very large in size.
2.     They required a large amount of energy for processing.
3.     Very expensive.
4.     They generated heat and needed air conditioning.
5.     Not portable (could not be moved from one place to another).
6.     Compared with fifth-generation computers, these computers were slow in speed.
7.     Not reliable.
8.     Continuous maintenance was required for proper processing.


SECOND GENERATION (1955 - 1964)

Transistors replaced vacuum tubes and ushered in the second generation of computers. A transistor is a device composed of semiconductor material that amplifies a signal or opens or closes a circuit. Invented in 1947 at Bell Labs, transistors have become the key ingredient of all digital circuits, including computers. Today's microprocessors contain tens of millions of microscopic transistors.
Prior to the invention of transistors, digital circuits were composed of vacuum tubes, which had many disadvantages. They were much larger, required more energy, dissipated more heat, and were more prone to failures. It's safe to say that without the invention of transistors, computing as we know it today would not be possible.

The transistor was invented in 1947 but did not see widespread use in computers until the late 50s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology. The first computers of this generation were developed for the atomic energy industry.
Examples: IBM 7074 series, CDC 1604, IBM 1400 series.
Advantages:
1.     Compared with first-generation computers, less expensive and smaller in size.
2.     Faster in speed.
3.     Less heat generated than by first-generation computers.
4.     Lower power consumption.
5.     After machine language, assembly language was introduced for programming, along with early high-level languages (COBOL, FORTRAN).
6.     Portable.


Disadvantages:
1.     Maintenance of the machine was still required.
2.     Air conditioning was still required, as heat slowed processing.
3.     These computers were not used as personal systems.
4.     They were used mainly for commercial purposes.


THIRD GENERATION (1964 - 1975)

The development of the Integrated Circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.


Examples: IBM System/360 and IBM System/370, DEC PDP-8, UNIVAC 1108, UNIVAC 9000.

Advantages:
1.      Smaller in size.
2.      Lower cost than previous generations.
3.      Low power consumption.
4.      Easy to operate.
5.      Portable.
6.      Input devices such as the keyboard and mouse were introduced, making it easy for the user to interact with the machine.
7.      External storage media such as floppy disks and tape were introduced.

Disadvantages:
1.     IC chips were still difficult to maintain.
2.     Complex technology was needed.


FOURTH GENERATION (SINCE 1975)

The microprocessor brought about the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use them. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.

Examples: Intel and AMD processor-based machines.
Advantages:
1.      Smaller in size.
2.      High processing speed.
3.      Very reliable.
4.      General purpose.
5.      More external storage media were introduced, such as CD-ROM and DVD-ROM.
6.      GUIs were developed for interaction.


FIFTH GENERATION (SINCE 1980)

Fifth-generation computing devices, based on artificial intelligence, are still in development, though some applications, such as voice recognition, are being used today.
The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come.
The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

Examples: ULSI technology, artificial intelligence, etc.
1.      Program independent.
2.      Capable of thinking and analysis on their own.
3.      Voice recognition and biometric devices.
4.      Self-organization and learning.

