Definition: AI scaling


AI scaling is the common approach to building more capable AI models: train on more data examples, with more GPU chips, in larger datacenters. However, OpenAI co-founder Ilya Sutskever, a central figure in AI, argues that scaling is reaching diminishing returns and that more research should go into finding new ways of building models from scratch. See AI secret sauce and AI glossary.
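The diminishing returns mentioned above are often described with empirical power-law scaling curves, where loss falls as compute rises but each further increase buys less improvement. A minimal sketch of that shape (the constants and exponent here are hypothetical, chosen only to illustrate the curve, not taken from any published scaling study):

```python
# Illustrative sketch of diminishing returns under power-law scaling.
# Loss modeled as a * compute**(-alpha); a and alpha are hypothetical.
def loss(compute, a=10.0, alpha=0.05):
    """Hypothetical loss as a function of training compute."""
    return a * compute ** -alpha

# Each 10x jump in compute buys a smaller absolute loss reduction.
budgets = [10 ** k for k in range(1, 6)]
gains = [loss(c) - loss(c * 10) for c in budgets]
assert all(later < earlier for earlier, later in zip(gains, gains[1:]))
```

Under this kind of curve, the first tenfold increase in compute helps the most and every later tenfold increase helps less, which is the pattern behind the "diminishing returns" argument.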

Sutskever does not deny that more scaling yields more advanced systems; however, he says future success will depend on more advanced training methods, not just more training, and on more efficient algorithms, not just more passes through the data. See AI training vs. inference and neural network.