Definition: AI datacenter


A datacenter dedicated to AI training and processing (inference). The top companies developing AI (xAI, Microsoft, Google, etc.) have deployed server clusters with thousands of GPUs for training large language models (LLMs). See GPU and Blackwell.

The deep learning training phase consumes enormous amounts of computing power and electricity, and AI datacenters are built precisely for that purpose. A server rack in a traditional datacenter draws 5 to 10 kilowatts of power; an AI datacenter generally requires 60 kilowatts or more per GPU server rack, and a facility may contain a thousand or more racks. For example, when all phases are in operation, Elon Musk's Memphis AI datacenter will use 200 megawatts of electricity (see Project Colossus).
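The rack-level figures above can be turned into a back-of-envelope facility total. This sketch uses the kilowatt ranges cited in this definition; the rack count is an illustrative assumption ("a thousand or more racks"), not the spec of any real facility, and it counts only the server racks themselves (cooling and power distribution add substantial overhead on top).

```python
# Rough comparison of facility power draw using the figures above.
# NUM_RACKS is an illustrative assumption, not a real facility spec.

TRADITIONAL_RACK_KW = 10   # upper end of the 5-10 kW range cited
AI_RACK_KW = 60            # typical AI GPU server rack, per the text
NUM_RACKS = 1000           # "a thousand or more racks" (assumption)

def total_megawatts(kw_per_rack: float, racks: int) -> float:
    """Total draw in megawatts for the racks alone (no cooling overhead)."""
    return kw_per_rack * racks / 1000  # 1 MW = 1000 kW

traditional_mw = total_megawatts(TRADITIONAL_RACK_KW, NUM_RACKS)
ai_mw = total_megawatts(AI_RACK_KW, NUM_RACKS)

print(f"Traditional datacenter: {traditional_mw:.0f} MW")  # 10 MW
print(f"AI datacenter:          {ai_mw:.0f} MW")           # 60 MW
```

At 60 MW for the racks alone, a hypothetical thousand-rack AI facility already lands in the same order of magnitude as the 200 MW cited for the Memphis site.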

Fifty Times More Energy From 2025 to 2030
It is estimated that by 2030 the energy required for AI will be fifty times greater than in 2025, and thousands of datacenters are being built around the world to meet that demand. In the U.S., sites in Wyoming, Indiana, Iowa, Texas, Oregon and Washington State are emerging as new locations. Northern Virginia, with its reliable power supply, has already experienced enormous datacenter growth.
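To put the fifty-times estimate in perspective, this small calculation shows the compound annual growth it implies over the five years from 2025 to 2030; the fifty-times figure is the estimate quoted above, and the arithmetic is simply the fifth root of the total growth.

```python
# What annual growth rate does "50x from 2025 to 2030" imply?
years = 2030 - 2025          # 5-year span
total_growth = 50            # estimated multiple cited above

# Compound growth: annual_factor ** years == total_growth
annual_factor = total_growth ** (1 / years)

print(f"Implied growth: {annual_factor:.2f}x per year")  # about 2.19x
```

In other words, a fifty-fold rise over five years means AI energy demand would need to more than double every single year.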

Nuclear Reactors
A potential source of electricity to meet this massive requirement is nuclear energy in the form of traditional reactors as well as their small modular counterparts, a fast-growing industry (see small modular reactor). Pundits still have faith that, in time, nuclear fusion will save the day, not only for AI, but for the entire world (see nuclear fusion).