Machine learning often employs several types of iterations, such as:
#[[Stochastic gradient descent]] (SGD): when each iteration uses only one [[example]] of the [[training data]]. After processing that single example, the model updates its weights and biases.
#[[Mini-batch gradient descent]]: when each iteration uses a randomly chosen subset of the training data, balancing speed of [[convergence]] with [[stability]] in the optimization process.
#[[Batch gradient descent]]: when each iteration uses all of the training data. This form of gradient descent offers stability but may be computationally expensive for large datasets.
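The three variants above differ only in how many examples each update uses. A minimal sketch in Python with NumPy, fitting a one-variable linear model; the learning rate, epoch count, and toy dataset are illustrative choices, not from the article:

```python
import numpy as np

def gradient_descent(X, y, batch_size=None, lr=0.1, epochs=500, seed=0):
    """Fit y ~ w*x + b by minimizing mean squared error.

    batch_size=1    -> stochastic gradient descent (one example per update)
    batch_size=k    -> mini-batch gradient descent (random subset per update)
    batch_size=None -> batch gradient descent (all examples per update)
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    if batch_size is None:
        batch_size = n
    w, b = 0.0, 0.0
    for _ in range(epochs):
        idx = rng.permutation(n)          # shuffle so mini-batches vary each epoch
        for start in range(0, n, batch_size):
            sel = idx[start:start + batch_size]
            err = (w * X[sel] + b) - y[sel]
            # gradients of mean squared error over the current batch
            w -= lr * 2 * np.mean(err * X[sel])
            b -= lr * 2 * np.mean(err)
    return w, b

# Toy, noise-free data generated from y = 3x + 1 (illustrative only)
X = np.linspace(0.0, 1.0, 100)
y = 3.0 * X + 1.0

for bs, name in [(None, "batch"), (16, "mini-batch"), (1, "SGD")]:
    w, b = gradient_descent(X, y, batch_size=bs)
    print(f"{name:10s} w={w:.3f} b={b:.3f}")  # each converges near w=3, b=1
```

Note the trade-off the list describes: with `batch_size=None` every update is computed from all 100 examples (stable but costly per step), while `batch_size=1` performs 100 cheap, noisy updates per epoch.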
==Explain Like I'm 5 (ELI5)==