Iteration
- See also: Machine learning terms
Introduction
In machine learning, an iteration is one update of a model's parameters (weights and biases) during training. The number of examples the model processes in each iteration is determined by the hyperparameter batch size. If the batch size is 50, the model processes 50 examples before updating its parameters - that is one iteration.
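As a concrete illustration of this relationship, the number of iterations in one epoch (a full pass over the training data) follows directly from the dataset size and the batch size. The figures below (1,000 examples, batch size 50) are hypothetical.

  import math

  num_examples = 1000  # hypothetical training-set size
  batch_size = 50      # examples processed per iteration

  # Each iteration consumes one batch, so a full pass over the data
  # takes ceil(num_examples / batch_size) iterations.
  iterations_per_epoch = math.ceil(num_examples / batch_size)
  print(iterations_per_epoch)  # 20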
Training itself is an iterative process: the model's parameters are adjusted based on the errors made on a training dataset so that the model can make accurate predictions on new data. By repeating this process many times, the model learns from its errors and improves its accuracy.
One common application of iteration in machine learning is gradient descent, an optimization algorithm designed to find the minimum of a cost function. In gradient descent, the model's parameters are updated iteratively based on the gradient of the cost function with respect to the parameters.
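As a minimal sketch of the idea, each gradient descent iteration applies the update theta = theta - learning_rate * gradient. The quadratic cost below is a made-up stand-in for a real model's cost function.

  def cost(theta):      # hypothetical cost function, minimized at theta = 3
      return (theta - 3.0) ** 2

  def gradient(theta):  # derivative of the cost with respect to theta
      return 2.0 * (theta - 3.0)

  theta = 0.0           # initial parameter value
  learning_rate = 0.1
  for _ in range(100):  # each loop body is one iteration
      theta = theta - learning_rate * gradient(theta)

  print(theta)  # approaches 3.0, the minimizer of the cost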
What Happens in an Iteration?
In training a neural network, a single iteration includes the following steps (a code sketch of one full iteration follows the list):
- A forward pass to calculate the loss on a single batch of data.
- A backward pass (backpropagation) to update the network's parameters based on the loss, with the size of the update scaled by the learning rate.
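The sketch below shows both steps on one batch for a one-layer linear model with a mean squared error loss. The data, shapes, and learning rate are invented for illustration; a real network would have more layers and typically rely on a framework's automatic differentiation.

  import numpy as np

  rng = np.random.default_rng(0)
  X = rng.normal(size=(50, 4))   # one batch: 50 examples, 4 features
  y = rng.normal(size=(50, 1))   # synthetic targets
  W = np.zeros((4, 1))           # weights
  b = np.zeros(1)                # bias
  learning_rate = 0.01

  # Forward pass: predictions and the loss on this batch.
  preds = X @ W + b
  loss = np.mean((preds - y) ** 2)

  # Backward pass: gradients of the loss with respect to the parameters.
  grad_preds = 2.0 * (preds - y) / len(X)
  grad_W = X.T @ grad_preds
  grad_b = grad_preds.sum(axis=0)

  # Parameter update - this completes one iteration.
  W -= learning_rate * grad_W
  b -= learning_rate * grad_b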
Types of Iterations
Machine learning employs several variants of gradient descent, which differ in how many training examples each iteration uses (a sketch follows the list):
- Stochastic gradient descent (SGD): each iteration uses only one example of the training data. After processing that single example, the model updates its weights and biases. Each iteration is fast, but the updates can be unstable.
- Mini-batch gradient descent: each iteration uses a randomly chosen subset of the training data, balancing speed of convergence with stability in the optimization process.
- Batch gradient descent: each iteration uses all of the training data. This form of gradient descent offers stability but may be computationally expensive for large datasets.
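These three variants can be viewed as the same update loop run with different batch sizes. The helper below is a hypothetical illustration of that point, not a library API; the dataset is synthetic.

  import numpy as np

  def sample_batch(X, y, batch_size, rng):
      # batch_size = 1          -> stochastic gradient descent
      # 1 < batch_size < len(X) -> mini-batch gradient descent
      # batch_size = len(X)     -> batch gradient descent
      idx = rng.choice(len(X), size=batch_size, replace=False)
      return X[idx], y[idx]

  rng = np.random.default_rng(0)
  X = rng.normal(size=(1000, 4))  # synthetic training set
  y = rng.normal(size=(1000, 1))

  X_b, y_b = sample_batch(X, y, 1, rng)       # SGD: one example
  X_b, y_b = sample_batch(X, y, 32, rng)      # mini-batch of 32
  X_b, y_b = sample_batch(X, y, len(X), rng)  # full batch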
Explain Like I'm 5 (ELI5)
Iteration in machine learning is like making educated guesses to find the correct answer. Imagine playing a guessing game with friends who tell you whether your guess is too high or too low; you can use that feedback to make an even better guess the next time around. This process of making a guess and refining it with feedback is known as iteration.
Machine learning relies on iteration to help computers learn from data. The computer begins with an initial guess about how to make predictions, and then updates its guess according to how well it did. This cycle continues until it gets as close to the right answer as possible.