A computer dedicated to AI training and AI processing, the latter known as "inference." The training phase takes the most computing power, and datacenters with thousands, tens of thousands and even hundreds of thousands of GPU-based servers are used to create machine learning models; for example, see
Project Colossus.
Lots of Memory
The actual work in both the training and inference stages of AI is a massive number of math computations, and the model parameters and data they operate on must be held in memory close to the processors. As a result, AI servers typically have a lot of memory in both their CPUs and GPUs. Hundreds of gigabytes are not unusual. See
AI datacenter,
AI model and
AI training vs. inference.
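
To see why hundreds of gigabytes add up quickly, the following Python sketch estimates the memory needed just to hold a model's weights. The 70-billion-parameter count and 16-bit precision are illustrative assumptions, not figures for any particular model, and a real server needs additional memory for activations, optimizer state and caches.

# Rough estimate of the memory needed just to store a model's weights.
def weight_memory_gb(num_parameters, bytes_per_parameter=2):
    # 2 bytes per parameter corresponds to 16-bit (half-precision) storage.
    return num_parameters * bytes_per_parameter / 1e9

# A hypothetical 70-billion-parameter model stored in 16-bit precision:
print(round(weight_memory_gb(70e9)), "GB")   # prints 140 GB for the weights alone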