A neural network comprises multiple layers of nodes that are mathematically connected to each other using weights and biases, collectively known as the "parameters." Set randomly at the start, the parameters are continually adjusted during the training phase to produce the best results. See
neural network and
AI hyperparameter.
Weights and Biases
The weights control the strength of the connections, and the bias adds a fixed offset. The activation function is a formula applied to the weighted sum of inputs plus the bias to compute the strength of the neuron's output.
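As a minimal sketch, a single neuron's output can be computed from its weights and bias; the sigmoid activation used here is one common choice, and the input values are illustrative only:

```python
import math

def neuron_output(inputs, weights, bias):
    # Weighted sum of the inputs plus the fixed bias offset
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Sigmoid activation squashes the result into the range (0, 1)
    return 1 / (1 + math.exp(-z))

# Example: a neuron with two inputs
print(neuron_output([0.5, -1.0], weights=[0.8, 0.2], bias=0.1))
```

During training, the weights and bias above are the values that get adjusted to reduce the network's error.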