Definition: integrated circuit


The formal name for the chip. An integrated circuit (IC) combines multiple transistors, resistors and capacitors on a single substrate. Prior to integrated circuits, these electronic components were discrete devices wired to each other on a printed circuit board.

In 1958, Texas Instruments inventor Jack Kilby demonstrated the concept. Although integrated circuits were not commercialized until Robert Noyce of Fairchild Semiconductor developed his silicon-based device a few months later, Kilby's circuit proved that multiple electronic components could be constructed as a single unit. See chip, microcontroller and transistor.




The First Integrated Circuit
Demonstrated by TI in September 1958, this half-inch-wide, archaic-looking collection of a transistor, a capacitor and two resistors mounted on a bar of germanium was the first IC. (Image courtesy of Texas Instruments, Inc.)






Seven Years Later - Three Transistors
This amplifier circuit from Siemens was mass-produced in 1965. Containing three transistors and five resistors on a 1.5-square-millimeter chip, it was a world of sophistication compared to Kilby's invention. (Image courtesy of Siemens AG, www.siemens.com)






A Half Century Later - 35 Billion Transistors
Xilinx's Versal chip includes multiple CPUs, RAM and an FPGA section comprising configurable circuits (see FPGA and Versal).