- First generation (1945-1956)
- Second generation (1956-1963)
- Third generation (1964-1971)
- Fourth generation (1971-present)
- Fifth generation (present-future)
Each of the five generations of computers is characterized by a major technological development that fundamentally changed the way computers operate.
Computers play an important role in almost every aspect of human life, but computers as we know them today are very different from the initial models.
[Figure: a computer from the 1950s, United States.]
But what is a computer? A computer can be defined as an electronic device that performs arithmetic and logical operations.
Another common definition describes a computer as a device or machine that processes data to convert it into information.
To understand the basic functioning of a computer, it is necessary to define data, processing, and information.
Data is a collection of basic elements with no inherent order; by themselves, these elements have no meaning.
Processing is the operation by which information is extracted from data, and information is the final product of any processing job.
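To make the distinction concrete, here is a minimal sketch (written in Python, with invented example values) of the pipeline just described: raw data goes in, processing happens, and information comes out.

```python
# Data -> processing -> information, as defined above.
readings = [21.5, 22.0, 23.4, 22.8]      # data: bare numbers with no meaning yet

average = sum(readings) / len(readings)  # processing: extract something useful

print(f"Average temperature: {average:.1f} degrees")  # information: the result
```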
The conceptual ancestor of the computer dates back to 1833, when Charles Babbage began designing his Analytical Engine, a mechanical rather than electronic machine.
Over the following century, computing machines became faster and more reliable, and the first generation of electronic computers was born with the ENIAC machine.
First generation (1945-1956)
The vacuum tube is the defining technology of the first generation of computers. Vacuum tubes are sealed glass tubes containing electrodes.
These tubes formed the circuits of the first computers. Additionally, these machines used magnetic drums for memory.
The vacuum tube was invented in 1906 by the electrical engineer Lee De Forest. During the first half of the 20th century, this was the main technology used to build radios, televisions, radars, X-ray machines, and other electronic devices.
First-generation machines were generally programmed through wired plugboards or by instructions encoded on punched paper tape.
They were very expensive, consumed a lot of electricity, generated a lot of heat, and were huge (often taking up entire rooms).
The first operational general-purpose electronic computer was ENIAC, which used 18,000 vacuum tubes. It was built at the University of Pennsylvania in the United States and was about 30.5 meters long.
It was used mainly for war-related calculations, such as those connected with the construction of the atomic bomb.
The Colossus machine was also built during these years to help the British during World War II. It was used to decipher enemy secret messages and contained 1,500 vacuum tubes.
While these first-generation machines were programmable, their programs were not stored internally; this changed with the development of stored-program computers.
First-generation computers relied on machine language (1GL), the lowest-level programming language, which computers understand directly.
They could only solve one problem at a time, and it could take operators weeks to set up a new problem.
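To give a feel for what programming in machine language meant, here is a minimal sketch of a toy numeric instruction set, along with the few lines of Python needed to execute it. The opcodes and the program are invented for illustration and do not correspond to any historical machine.

```python
# A toy "machine language" program: nothing but raw numbers.
# Opcode 1 = load a value, 2 = add a value, 3 = print the accumulator.
program = [1, 40, 2, 2, 3]  # means: LOAD 40, ADD 2, PRINT

accumulator = 0
pc = 0  # program counter
while pc < len(program):
    opcode = program[pc]
    if opcode == 1:                      # load an immediate value
        accumulator = program[pc + 1]
        pc += 2
    elif opcode == 2:                    # add an immediate value
        accumulator += program[pc + 1]
        pc += 2
    elif opcode == 3:                    # output the accumulator
        print(accumulator)               # prints 42
        pc += 1
```

Writing and debugging long sequences of bare numbers like `program` above is part of what made first-generation programming so slow.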
Second generation (1956-1963)
The second generation of computers replaced vacuum tubes with transistors. Transistors allowed computers to become smaller, faster, cheaper, and more energy-efficient. Magnetic disks and tapes were often used to store data.
Although transistors still generated enough heat to damage computers, they were a clear improvement over the previous technology.
Second-generation computers used cooling technology and saw wider commercial use, although they were still limited to specific business and scientific purposes.
Second-generation computers left behind cryptic binary machine language in favor of assembly language (2GL), which allowed programmers to specify instructions in words, as the sketch below shows.
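The step from 1GL to 2GL can be sketched with the same invented toy instruction set used in the first-generation example above: an assembler simply translates word-like mnemonics back into the raw numbers the machine actually runs. This is purely illustrative, not a real assembler.

```python
# Mnemonic -> numeric opcode table for the invented toy machine.
OPCODES = {"LOAD": 1, "ADD": 2, "PRINT": 3}

def assemble(source):
    """Translate lines like 'LOAD 40' into a flat list of machine numbers."""
    machine_code = []
    for line in source.strip().splitlines():
        parts = line.split()
        machine_code.append(OPCODES[parts[0]])          # the mnemonic
        machine_code.extend(int(p) for p in parts[1:])  # any operands
    return machine_code

print(assemble("LOAD 40\nADD 2\nPRINT"))  # -> [1, 40, 2, 2, 3]
```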
During this period, high-level programming languages were also being developed. Second-generation computers were likewise the first machines to store their instructions in memory, which by then had evolved from magnetic drums to magnetic-core technology.
Third generation (1964-1971)
The hallmark of the third generation of computers was integrated circuit technology. An integrated circuit is a single small device that contains many transistors.
Transistors were made smaller and placed on silicon chips, called semiconductors. Thanks to this change, computers became faster and more efficient than those of the second generation.
During this time, computers used third-generation languages (3GL), or high-level languages; classic examples include FORTRAN, COBOL, and BASIC, and much later languages such as Java and JavaScript also belong to this category.
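The jump to a 3GL is easy to see against the two sketches above: in a high-level language, the programmer states the computation directly and lets a compiler or interpreter worry about opcodes and memory. Python stands in here for the high-level languages of the era.

```python
# The entire toy program from the earlier sketches, in one high-level line.
print(40 + 2)  # prints 42 -- no opcodes, no program counter, no accumulator
```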
The machines of this period gave rise to a new approach to computer design: a single architecture spanning a whole family of machines, so that a program written for one machine in the family could run on the others.
Another change in this period was that users now interacted with computers through keyboards and monitors, via an interface and an operating system.
Thanks to this, a machine could run several applications at the same time, with a central program managing the memory.
IBM created the most important computer of this period: the IBM System/360. Another of the company's models was 263 times faster than ENIAC, demonstrating how far the field had advanced by then.
Because these machines were smaller and cheaper than their predecessors, computers became accessible to a general audience for the first time.
Computers of this period also served general purposes. This was important, since earlier machines had been used for specific purposes in specialized fields.
Fourth generation (1971-present)
The fourth generation of computers is defined by the microprocessor. This technology allows thousands of integrated circuits to be built on a single silicon chip.
This advance made it possible for what once occupied an entire room to fit in the palm of a hand.
In 1971, the Intel 4004 chip was developed, placing all of the computer's components, from the central processing unit and memory to the input and output controls, on a single chip. This marked the beginning of the computer generation that continues to this day.
In 1981, IBM created a new computer capable of executing 240,000 additions per second. In 1984, Apple introduced the Macintosh, which ran its own operating system rather than Microsoft's. In 1996, Intel went further and created a machine capable of executing 400,000,000 additions per second.
Fourth-generation computers became more powerful, more compact, more reliable, and more accessible. As a result, the personal computer (PC) revolution was born.
In this generation, real-time networks, distributed operating systems, and time-sharing came into use. The internet was also born during this period.
Microprocessor technology is found in all modern computers, because the chips can be manufactured in large quantities at low cost.
Processor chips are used as central processing units, and memory chips are used for random-access memory (RAM). Both kinds of chip contain millions of transistors placed on their silicon surface.
These computers use fourth-generation languages (4GL), whose statements resemble sentences in human language.
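SQL, a language commonly cited as a 4GL, shows the idea well: the statement says what result is wanted, not how to compute it. The sketch below runs a SQL query through Python's built-in sqlite3 module; the table and its values are invented for illustration.

```python
import sqlite3

# An invented in-memory table, just to have something to query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 10), ("south", 25), ("north", 5)])

# The 4GL part: a declarative statement close to an English sentence.
total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'north'"
).fetchone()[0]
print(total)  # 15 -- no explicit loop over the rows anywhere
```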
Fifth generation (present-future)
Fifth-generation devices are based on artificial intelligence. Most of these machines are still in development, but some applications, such as speech recognition, already make use of artificial intelligence.
The use of parallel processing and superconductors is helping to make artificial intelligence a reality.
Fifth-generation technology has produced microprocessor chips containing ten million electronic components.
This generation is based on parallel-processing hardware and artificial-intelligence software. Artificial intelligence is an emerging field of computer science that studies the methods needed to make computers think like human beings.
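As a concrete, if very small-scale, illustration of parallel processing, the sketch below splits an invented workload across several CPU cores using Python's standard multiprocessing module.

```python
from multiprocessing import Pool

def square(n):
    """An invented stand-in for a unit of work."""
    return n * n

if __name__ == "__main__":
    # Four worker processes run the same function on different inputs at once.
    with Pool(processes=4) as pool:
        results = pool.map(square, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```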
Quantum computing and nanotechnology are expected to radically change the face of computers in the future.
The goal of fifth-generation computing is to develop devices that can respond to natural language input and are capable of learning and organizing themselves.
The idea is that the fifth-generation computers of the future will understand spoken words and mimic human reasoning. Ideally, these machines will be able to respond to their environment using different types of sensors.
Scientists are working to make this a reality, trying to create computers with genuine intelligence with the help of advanced technology and programs. These advances promise to revolutionize the computers of the future.