Definition: AI training passes


AI training is accomplished by computing the input data forward and backward through the neural network. The forward pass predicts the next word or part of a word (token) in a sentence, while the backward pass adjusts the numerical weights in the network to correct the prediction errors from the forward pass. See AI weights and biases.
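The two passes can be sketched in a few lines. The following is a minimal, illustrative example with a toy single-layer network: the forward pass turns the current token into a probability for every possible next token, and the backward pass computes the prediction error and nudges the weights to reduce it. All names and numbers here are assumptions for illustration, not any particular model's code.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = 5                                 # toy vocabulary of 5 tokens
W = rng.normal(0, 0.1, (vocab, vocab))    # the "weights" adjusted during training

x = np.zeros(vocab)
x[2] = 1.0                                # current token, one-hot encoded
target = 4                                # the actual next token in the text

# Forward pass: predict a probability for every possible next token
logits = x @ W
probs = np.exp(logits - logits.max())
probs /= probs.sum()
loss_before = -np.log(probs[target])      # prediction error (cross-entropy)

# Backward pass: gradient of the error with respect to the weights,
# then adjust the weights to correct the prediction
grad_logits = probs.copy()
grad_logits[target] -= 1.0
grad_W = np.outer(x, grad_logits)
W -= 0.5 * grad_W                         # learning-rate step
```

Training repeats this forward/backward cycle over the entire dataset; each cycle makes the network slightly better at predicting the next token.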

With large language models (LLMs), it can take weeks or months of forward and backward passes to fully train the model, because trillions of calculations must be repeated millions of times.
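A back-of-the-envelope calculation shows why the scale is so large. A common rule of thumb approximates training compute as roughly 6 operations per model parameter per training token; every number below is an illustrative assumption, not a figure from any specific model.

```python
# Rough training-time estimate (all values are illustrative assumptions)
params = 70e9             # a 70-billion-parameter model
tokens = 2e12             # 2 trillion training tokens
total_flops = 6 * params * tokens   # ~6 * N * D rule of thumb

gpus = 4096               # GPUs in the cluster
flops_per_gpu = 300e12    # ~300 TFLOPS sustained per GPU (assumed)

seconds = total_flops / (gpus * flops_per_gpu)
days = seconds / 86400
print(f"about {days:.0f} days")
```

Even with thousands of GPUs working in parallel, the assumed model lands in the range of days to weeks, and larger models or slower effective throughput push that into months.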

The neural network is processed by four to eight GPUs connected to memory and to each other within a server, and a rack can hold dozens of servers. A datacenter can contain thousands of racks, all connected via an optical network, and moving data between the GPUs and memory, and among all the GPUs in the system, takes the most time.

In contrast, when the AI application is executed for the user (inference processing), there may be a dozen to a thousand passes (not millions), based on the number of text elements (tokens) at the output side of the network. See neural network and AI training vs. inference.
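The reason the pass count tracks the output length is that inference generates one token per forward pass, feeding each new token back in to produce the next. The sketch below uses a hypothetical `predict_next` stand-in for a real model's forward pass; there is no backward pass at inference time.

```python
# Inference sketch: one forward pass per generated output token.
# predict_next is a hypothetical stand-in for a real model's forward pass.
def predict_next(tokens):
    # Toy "model": the next token is simply (last token + 1) mod 10
    return (tokens[-1] + 1) % 10

prompt = [3, 4, 5]            # tokens from the user's input
output = list(prompt)
for _ in range(4):            # four output tokens -> four forward passes
    output.append(predict_next(output))

print(output)                 # → [3, 4, 5, 6, 7, 8, 9]
```

A four-token reply needs four passes; a thousand-token reply needs a thousand, which is why inference is dramatically cheaper than training.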