Definition: normal computing


(1) Regular computing; traditional computers; routine operations.

(2) (NORMAL COMPUTING CORP., New York) Founded in 2022 by former Google employees who helped pioneer AI, NORMAL makes the hardware and accompanying software to generate images using as much as a thousand times less power than GPU-based generation.

The primary way AI generates images is the diffusion method: during training, real images are progressively turned into random noise; at generation time, the process runs in reverse, starting from random noise and stripping it away step by step. The randomness of the starting noise is what creates the endless possibilities. NORMAL chips are a type of thermodynamic computing: they harness the natural fluctuations that heat produces at the molecular level to generate that random noise. Such a chip is also called a "stochastic" chip ("stochastic" means random).
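
To make the mechanics concrete, here is a minimal Python sketch of the reverse-diffusion idea. It is an illustration, not NORMAL's software: toy_denoise and generate are hypothetical names, and NumPy's software random generator stands in for the thermal noise a thermodynamic chip would supply in hardware.

import numpy as np

def toy_denoise(x, t):
    # Stand-in for a trained denoising model: nudges values toward zero.
    return x * 0.95

def generate(rng, shape=(64, 64), steps=50):
    # Start from pure random noise -- the source of the "endless
    # possibilities" -- then strip the noise away step by step.
    x = rng.standard_normal(shape)
    for t in reversed(range(steps)):
        x = toy_denoise(x, t)
    return x

image = generate(np.random.default_rng())

The key point the sketch shows: once the starting noise is fixed, the rest of the generation is deterministic, so all the variety comes from that first random sample.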

For Inference Image Generation
The NORMAL chip works only for inference, not training. It supplies the randomness needed at generation time to enable multiple outcomes; otherwise, every image produced from the same prompt would be identical, as the sketch below illustrates.
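
A minimal sketch, assuming a hypothetical toy_generate function, of why that injected randomness matters at inference: identical starting noise reproduces the identical image, while different noise yields different images.

import numpy as np

def toy_generate(seed, shape=(8, 8), steps=20):
    # Hypothetical generator: the only input that varies between
    # runs is the random noise it starts from.
    x = np.random.default_rng(seed).standard_normal(shape)
    for _ in range(steps):
        x = x * 0.95          # toy denoising step
    return x

# Different noise sources yield different images...
assert not np.allclose(toy_generate(seed=1), toy_generate(seed=2))
# ...while identical noise yields identical images. Without a source
# of randomness, every generation would produce the same output.
assert np.allclose(toy_generate(seed=1), toy_generate(seed=1))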

The company's goal is for datacenters to mix and match hardware. By deploying NORMAL chips for image generation, the electricity cost is dramatically lower than it is with GPUs, the primary AI chip. See GPU, inference and inference engine.




The Image Generator
As of 2025, this NORMAL "thermodynamic" chip uses hardware-based randomness to create images. Naturally, the company hopes this revolutionary chip will become "Normal." See stochastic computing and AI datacenter.