Gradient descent

{{see also|Machine learning terms}}
==Introduction==
Gradient descent is a popular optimization algorithm in machine learning. Its goal is to minimize the [[loss]] of the model during [[training]]: it iteratively adjusts the model's weights and biases in the direction that reduces the error between predicted output and actual output.
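The update rule described above can be sketched in a few lines of Python. This is a minimal illustration, not a full training loop: it assumes a single parameter <code>w</code> and a hypothetical quadratic loss <code>(w - 3)^2</code>, whose minimum sits at <code>w = 3</code>.

```python
# Minimal gradient descent sketch on a one-parameter quadratic loss.
# loss(w) = (w - 3)^2 is a stand-in for a model's loss; its minimum is at w = 3.
def loss(w):
    return (w - 3) ** 2

def gradient(w):
    # Derivative of the loss with respect to w.
    return 2 * (w - 3)

w = 0.0            # initial parameter value
learning_rate = 0.1
for step in range(100):
    # Step in the direction opposite the gradient to reduce the loss.
    w -= learning_rate * gradient(w)

print(w)  # converges toward 3.0, the minimizer of the loss
```

The same loop generalizes to many parameters: in practice the gradient is a vector of partial derivatives with respect to every weight and bias, and each is updated simultaneously.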


==How Gradient Descent Works==