

Definition: microcontroller


A single chip that contains a processor (CPU), non-volatile memory (flash memory or ROM) for the program, volatile memory (RAM) for processing data, a clock and an I/O control unit. Microcontroller units (MCUs) are available in numerous sizes and architectures. See CPU, flash memory, ROM, RAM and clock.

They Don't Get the Publicity
Because MCUs contain only 8-, 16- or 32-bit CPUs and cost just a few dollars, or even less than one dollar, they do not get the mainstream attention given to the latest 64-bit chips in a PC or graphics card, which cost several hundred dollars. MCUs also do not require state-of-the-art chip technology (see process technology).

However, MCUs are everywhere, and billions of these "computers on a chip" are embedded in products from toys to appliances to just about anything. New cars can employ a couple hundred of them. For example, an entire MCU might be dedicated to a simple task such as waiting for the driver to close the car door or press a particular button on the dashboard. See embedded system and automotive systems.






Motorola 6801 - One of the First
Introduced in 1978, the 6801 was one of the first semiconductor products to claim the "computer on a chip" moniker. These magnified images show the entire chip (top), about three quarters of the 256 bytes of RAM (left) and only a few bytes at 400x.






They Don't Get Much Smaller
These 8-bit PIC brand microcontrollers from Microchip are used in myriad applications, cost less than 50 cents each and are much more powerful than the 6801. We're not great technology predictors. In 1949, Popular Mechanics speculated that future computers would only weigh "one and a half tons!"






A Microcontroller Behind Everything
Today's cars can have more than a hundred MCUs, each controlling anything from the simplest function, such as sensing a button press, to more complicated systems like the ones in the Honda above.