Iteration
One common application of iteration in machine learning is [[gradient descent]], an [[optimization algorithm]] designed to find the minimum of a [[cost function]]. In gradient descent, the model's parameters are updated iteratively based on the [[gradient]] of the [[cost function]] with respect to the parameters.
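To make the update rule concrete, here is a minimal Python sketch of gradient descent on a one-parameter quadratic cost. The cost function, starting value, and learning rate are illustrative assumptions, not anything prescribed above:

<syntaxhighlight lang="python">
# A minimal sketch: gradient descent on the quadratic cost J(w) = (w - 3)^2,
# whose gradient is dJ/dw = 2 * (w - 3). All values here are illustrative.

def cost(w):
    return (w - 3.0) ** 2

def gradient(w):
    return 2.0 * (w - 3.0)

w = 0.0              # initial parameter value (assumed)
learning_rate = 0.1  # step size (assumed)

for step in range(50):                # each pass of the loop is one iteration
    w -= learning_rate * gradient(w)  # update: w <- w - lr * dJ/dw

print(round(w, 4), round(cost(w), 8)) # w converges towards 3, the minimizer
</syntaxhighlight>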


==What Happens in an Iteration?==
In training a [[neural network]], a single iteration includes the following steps (a minimal code sketch follows the list):


#A forward pass to calculate the [[loss]] on a single [[batch]] of data.
#A backward pass ([[backpropagation]]) to update the network's [[parameters]] based on the loss, with the size of each update controlled by the [[learning rate]].
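For concreteness, the following PyTorch sketch shows these steps within one iteration. The model, data shapes, loss function, and learning rate are all illustrative assumptions; the article does not prescribe a particular framework:

<syntaxhighlight lang="python">
import torch
import torch.nn as nn

# Illustrative setup: a tiny linear model and one random batch of data.
model = nn.Linear(10, 1)                # 10 input features -> 1 output
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

inputs = torch.randn(32, 10)            # one batch of 32 examples
targets = torch.randn(32, 1)

# --- One iteration ---
optimizer.zero_grad()                   # clear gradients from the previous iteration
loss = loss_fn(model(inputs), targets)  # 1. forward pass: compute the loss on the batch
loss.backward()                         # 2. backward pass: backpropagation computes gradients
optimizer.step()                        # parameter update scaled by the learning rate
</syntaxhighlight>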


==Types of Iterations==
Machine learning often employs several types of iterations, such as the following (compared in a code sketch after the list):


#[[Stochastic gradient descent]] (SGD): SGD updates the model using a single randomly chosen training example instead of the entire dataset. This enables rapid progress towards minimizing the cost function, but it introduces noise into the optimization process.
#[[Mini-batch gradient descent]]: Mini-batch gradient descent updates the model using a small, randomly chosen subset (mini-batch) of the training data, balancing the rapid progress of SGD against the stability of batch gradient descent. Averaging the gradient over a mini-batch reduces the variance of each update.
#[[Batch gradient descent]]: With batch gradient descent, the model is updated using all of the training data. This form of gradient descent offers stability but may be computationally expensive for large datasets.
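All three variants can be seen as the same update loop run with different batch sizes. Below is a minimal NumPy sketch of this idea for linear regression; the dataset, learning rate, and epoch count are illustrative assumptions:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))          # illustrative dataset: 1000 examples, 5 features
true_w = np.arange(1.0, 6.0)
y = X @ true_w + rng.normal(scale=0.1, size=1000)

def train(batch_size, lr=0.1, epochs=30):
    """Gradient descent for linear regression; batch_size selects the variant."""
    w = np.zeros(5)
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)        # shuffle so subsets are randomly chosen
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            error = X[batch] @ w - y[batch]
            grad = X[batch].T @ error / len(batch)  # gradient of the mean squared error
            w -= lr * grad                          # one iteration: one parameter update
    return w

w_sgd   = train(batch_size=1)     # stochastic: one example per update
w_mini  = train(batch_size=32)    # mini-batch: small random subset per update
w_batch = train(batch_size=1000)  # batch: the entire dataset per update
</syntaxhighlight>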


==Explain Like I'm 5 (ELI5)==
Iteration is the process of doing something over and over again, such as playing a game of tag to get better at it. In machine learning, iteration refers to a computer program repeatedly trying to improve its predictions by adjusting its settings each time something goes wrong (like getting tagged in tag) and then trying again. Over time, this practice helps the program make better predictions.