Training loss


==How Training Loss is Used==
The training loss measures how well a machine learning model fits its training data and is evaluated repeatedly during training. To minimize this loss, [[optimization algorithm]]s such as [[stochastic gradient descent]] (SGD) or [[Adam]] are employed; these algorithms iteratively adjust the model's [[parameters]] so that the training loss decreases.
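
The sketch below illustrates this loop: a loss function is evaluated on the training data, and an optimizer updates the parameters to reduce it. It assumes [[PyTorch]]; the synthetic dataset, linear model, mean-squared-error loss, learning rate, and epoch count are illustrative placeholders rather than a prescribed setup.
<syntaxhighlight lang="python">
import torch
from torch import nn

# Illustrative synthetic data: 100 samples with 3 features each.
X = torch.randn(100, 3)
y = torch.randn(100, 1)

model = nn.Linear(3, 1)        # placeholder model
loss_fn = nn.MSELoss()         # mean squared error as the training loss
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # stochastic gradient descent

for epoch in range(100):
    optimizer.zero_grad()            # reset accumulated gradients
    predictions = model(X)
    loss = loss_fn(predictions, y)   # training loss under the current parameters
    loss.backward()                  # gradients of the loss w.r.t. the parameters
    optimizer.step()                 # update parameters to reduce the training loss
    if epoch % 20 == 0:
        print(f"epoch {epoch}: training loss = {loss.item():.4f}")
</syntaxhighlight>
Each pass recomputes the training loss and nudges the parameters in the direction that lowers it, so the printed values typically decrease over the course of training.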


==Overfitting and Underfitting==