Definition
In a neural network, weights are the numerical values attached to the connections between neurons; they determine how strongly each input influences a neuron's output. During training, the weights are repeatedly adjusted to minimize the model's error on its training data. A model's "weights" are, in effect, its learned knowledge, and they are saved to files that can run to many gigabytes.
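The idea can be sketched with a single neuron: a weighted sum of inputs, plus one gradient-descent step that nudges the weights to reduce squared error. This is an illustrative toy, not any particular library's API; the function names and learning rate are assumptions.

```python
# Toy sketch: one neuron whose weights are adjusted by gradient descent.
# All names (predict, train_step) are illustrative, not a real API.

def predict(weights, inputs):
    # Each input is scaled by its weight; a larger weight means more influence.
    return sum(w * x for w, x in zip(weights, inputs))

def train_step(weights, inputs, target, lr=0.01):
    error = predict(weights, inputs) - target
    # Gradient of squared error with respect to each weight: 2 * error * input.
    return [w - lr * 2 * error * x for w, x in zip(weights, inputs)]

weights = [0.0, 0.0]  # untrained: the model has learned nothing yet
for _ in range(200):
    weights = train_step(weights, inputs=[1.0, 2.0], target=3.0)
# After training, predict(weights, [1.0, 2.0]) is close to the target 3.0.
```

A full network repeats this idea across millions or billions of weights; saving the model means writing all of those numbers to disk, which is why weight files grow so large.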