Definition: computer generations


Following is a brief summary of the generations of computers based on their hardware and software architecture. See computer revolution.

First Generation
In the late 1940s and early 1950s, computers (EDSAC, UNIVAC I, etc.) used vacuum tubes for their digital logic and mercury delay lines for memory. See early memory, EDSAC and UNIVAC I.

Second Generation
In the late 1950s, transistors replaced vacuum tubes for logic, and magnetic cores were used for memory (IBM 1401, Honeywell 800). Size was reduced, and reliability was significantly improved. See IBM 1401 and Honeywell.

Third Generation
In the mid-1960s, computers used the first integrated circuits (IBM 360, CDC 6400) and ran the first operating systems and database management systems. Although most processing was still batch oriented, using punch cards and magnetic tapes, online systems were being developed. This was the era of mainframes and minicomputers, essentially large centralized computers and small departmental computers. See punch card, System/360 and Control Data.

Fourth Generation
The mid-to-late 1970s spawned the microprocessor and the personal computer, introducing distributed processing and office automation. Word processing, query languages, report writers and spreadsheets put large numbers of people in touch with the computer for the first time. See query language and report writer.

Fifth Generation - Now
The 21st century ushered in the fifth generation, which has dramatically changed people's behavior. The advent of the smartphone in the 2010s was the beginning; combined with the greater penetration of the Internet around the globe, it means that anybody from anywhere is capable of controlling connected devices anywhere on the planet.

The fifth generation is making dramatic changes as it increasingly delivers various forms of artificial intelligence (AI). More sophisticated search and natural language recognition are the features users notice most, but software that emulates human functions is changing just about everything. See AI, ChatGPT, machine learning, deep learning, neural network, computer vision, virtual assistant and natural language recognition.




The Beginning of Commercial Computing
In 1951, the UNIVAC I ushered in the computer age. This 1956 installation in Frankfurt, Germany shows half the CPU (top left and below).







Then and Now
Imagine watching this delivery and hearing someone say, "everything on the ramp will fit on your fingertip some day." See computer prices.