Iteration

Machine learning often employs several types of iterations, such as:


#[[Stochastic gradient descent]] (SGD): when each iteration uses only one [[example]] from the [[training data]]. After processing that single example, the model updates its weights and biases. This enables rapid progress towards minimizing the cost function, but may introduce noise into the optimization process.
#[[Mini-batch gradient descent]]: when each iteration uses a randomly chosen subset of the training data, balancing speed of [[convergence]] with [[stability]] in the optimization process.
#[[Batch gradient descent]]: when each iteration uses all of the training data. This offers stability but may be computationally expensive for large datasets.
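As a rough illustration (a minimal sketch, not code from this article), all three variants can be written as the same gradient-descent loop on a toy linear-regression problem; the hypothetical batch_size parameter is what selects the variant:

```python
import numpy as np

def gradient_descent(X, y, batch_size, lr=0.1, iterations=2000, seed=0):
    """Gradient descent for linear regression. batch_size selects the variant:
    1 -> stochastic, 1 < batch_size < len(X) -> mini-batch, len(X) -> batch."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])  # weights, initialized to zero
    b = 0.0                   # bias
    for _ in range(iterations):
        # Each iteration draws a random subset of the training data
        idx = rng.choice(len(X), size=batch_size, replace=False)
        Xb, yb = X[idx], y[idx]
        error = Xb @ w + b - yb                  # prediction error on the subset
        w -= lr * (Xb.T @ error) / batch_size    # gradient of MSE w.r.t. weights
        b -= lr * error.mean()                   # gradient of MSE w.r.t. bias
    return w, b

# Toy dataset: y = 2x + 1 plus a little noise
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(200, 1))
y = 2 * X[:, 0] + 1 + rng.normal(0, 0.01, 200)

w_sgd, b_sgd = gradient_descent(X, y, batch_size=1)       # stochastic: 1 example
w_mini, b_mini = gradient_descent(X, y, batch_size=32)    # mini-batch: small subset
w_full, b_full = gradient_descent(X, y, batch_size=200)   # batch: whole dataset
```

All three runs should recover weights near the true values (2 and 1); the stochastic run fluctuates most per step, while the full-batch run is the most stable but touches every example on every iteration.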


==Explain Like I'm 5 (ELI5)==