Sunday 29 January 2012

Generations of Computers:


First Generation (1940-1956) Vacuum Tubes:

1. The first computers used vacuum tubes for circuitry and magnetic drums for memory.
2. They were often enormous, taking up entire rooms.
3. They were very expensive to operate; in addition to using a great deal of electricity, they generated a lot of heat, which was often the cause of malfunctions.
4. First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations.
5. The UNIVAC and ENIAC computers are examples of first-generation computing devices.


Second Generation (1956-1963) Transistors:

1. Transistors replaced vacuum tubes and were far superior to them, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable.
2. The transistor still generated a great deal of heat, which subjected the computer to damage.
3. Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words (see the small sketch after this list).
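
As a rough illustration of that shift, here is a toy Python sketch of what an assembler does: it translates the words a programmer writes into the binary the machine actually runs. The mnemonics and opcodes below are invented for the example and do not belong to any real second-generation machine.

# Toy assembler: symbolic mnemonics -> made-up 8-bit machine words.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

def assemble(program):
    """Translate instructions like 'ADD 7' into binary words."""
    machine_code = []
    for line in program:
        mnemonic, operand = line.split()
        # 4-bit opcode followed by a 4-bit operand address
        machine_code.append(OPCODES[mnemonic] + format(int(operand), "04b"))
    return machine_code

source = ["LOAD 5", "ADD 7", "STORE 9"]   # what the programmer writes
print(assemble(source))                   # ['00010101', '00100111', '00111001']

Real assemblers of the era did the same kind of translation, just for their machine's actual instruction set.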

Third Generation (1963-1971) Integrated Circuits:

1. Transistors were placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
2. A central program allowed the device to run many different applications at one time.
3. Third-generation computers were smaller and cheaper than their predecessors.


Fourth Generation (1971-Present) Microprocessors:

1. The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip.
2. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet.
3. In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh.



Fifth Generation (Present and Beyond) Artificial Intelligence:

1. Fifth generation computing devices, based on artificial intelligence, are still in development.
2. The use of parallel processing and superconductors is helping to make artificial intelligence a reality.
3. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come.


  DID YOU KNOW...?


An integrated circuit (IC) is a small electronic device made out of a semiconductor material. The first integrated circuits were developed independently in the late 1950s by Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor.