Redirected from: gradient descent

Definition: AI cost function


The difference between a machine learning model's predicted values and the expected (correct) values during training. The goal of training is to make this difference as small as possible, and "gradient descent" is a widely used optimization algorithm for minimizing the cost function. Gradient descent relies on "backpropagation," which propagates the error backward through the network and readjusts the parameters between layers on each pass (see neural network). See AI weights and biases and large language model.
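The idea can be sketched in a few lines of code. The example below is illustrative, not from the source: it uses mean squared error as the cost function and plain gradient descent to fit a one-parameter model y = w*x to toy data.

```python
# Illustrative sketch: mean squared error cost and gradient descent
# for a one-parameter linear model y = w * x (toy data, y = 2x).

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

def cost(w):
    # Cost function: average squared difference between
    # predicted (w * x) and expected (y) values.
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def gradient(w):
    # Derivative of the cost with respect to the parameter w.
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

w = 0.0                # initial guess for the parameter
learning_rate = 0.05
for step in range(100):
    # Step in the direction that reduces the cost.
    w -= learning_rate * gradient(w)

print(round(w, 3))     # converges toward 2.0, where the cost is smallest
```

In a real neural network the same loop runs over millions of parameters at once, with backpropagation supplying the gradient for each one.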