{{see also|Machine learning terms}}
==Introduction==
In [[machine learning]], an [[iteration]] is a single [[update]] of a [[model]]'s [[parameters]] ([[weights]] and [[biases]]) during [[training]]. The number of [[example]]s the model processes in each iteration is determined by the [[hyperparameter]] [[batch size]]. If the batch size is 50, the model processes 50 examples before updating its parameters; that is one iteration.
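A rough sketch of the arithmetic (the dataset size of 1,000 below is an illustrative assumption, not a figure from this article): the number of iterations needed for one full pass over the data follows directly from the batch size.

<syntaxhighlight lang="python">
# Illustrative sketch: how batch size determines the number of
# iterations (parameter updates) in one pass over the training data.
num_examples = 1000   # hypothetical dataset size
batch_size = 50       # as in the example above
iterations_per_pass = num_examples // batch_size
print(iterations_per_pass)  # 20
</syntaxhighlight>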
Training a machine learning model is an iterative process: the model's parameters are optimized so that it can make accurate [[prediction]]s on [[new data]]. The parameters are adjusted based on the [[error]]s the model makes during [[training]] on the [[training data|training]] [[dataset]]; by repeating this process many times, the model learns from its errors and improves its [[accuracy]].
One common application of iteration in machine learning is [[gradient descent]], an [[optimization algorithm]] that seeks the minimum of the [[cost function]]. In gradient descent, the model's parameters are updated iteratively based on the [[gradient]] of the cost function with respect to those parameters.
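As a minimal sketch of the idea (the toy cost function J(w) = (w - 3)^2 and the learning rate below are assumptions chosen for illustration, not part of this article), each pass through the loop is one iteration of gradient descent:

<syntaxhighlight lang="python">
# Toy gradient descent on J(w) = (w - 3)^2, whose minimum is at w = 3.
def gradient(w):
    return 2 * (w - 3)   # dJ/dw

w = 0.0                  # initial parameter value
learning_rate = 0.1
for _ in range(25):      # each pass through the loop is one iteration
    w = w - learning_rate * gradient(w)

print(round(w, 3))       # close to the minimum at w = 3
</syntaxhighlight>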
==What Happens in an Iteration?==
In training a [[neural network]], a single iteration includes the following two steps (a short code sketch follows the list):
#A forward pass to calculate the [[loss]] on a single [[batch]] of data.
#A backward pass ([[backpropagation]]) to update the network's [[parameters]] based on the loss and the [[learning rate]].
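The two steps can be sketched in code. The following is a hypothetical example using PyTorch; the framework, the tiny linear model, and the random data are assumptions chosen only for illustration.

<syntaxhighlight lang="python">
# One training iteration: forward pass, backward pass, parameter update.
import torch
import torch.nn as nn

model = nn.Linear(10, 1)                        # tiny stand-in "network"
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(50, 10)                         # one batch of 50 examples
y = torch.randn(50, 1)

loss = loss_fn(model(x), y)                     # forward pass: compute the loss

optimizer.zero_grad()
loss.backward()                                 # backward pass: backpropagation
optimizer.step()                                # update weights and biases
</syntaxhighlight>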
==Types of Iterations==
Machine learning often employs several types of iteration, which differ in how many training examples each one processes (a short code sketch follows the list):
#[[Stochastic gradient descent]] (SGD): each iteration uses only one [[example]] from the [[training data]]. After processing that single example, the model updates its weights and biases. SGD is fast, but it can be [[unstable]].
#[[Mini-batch gradient descent]]: each iteration uses a randomly chosen subset of the training data, balancing speed of [[convergence]] with [[stability]] in the optimization process.
#[[Batch gradient descent]]: each iteration uses the entire training set. This form of gradient descent offers [[stability]] but may be computationally expensive for large datasets.
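A small sketch of the difference (the NumPy toy dataset and the mini-batch size of 64 are illustrative assumptions): each variant simply draws a different number of examples for one iteration.

<syntaxhighlight lang="python">
# How each variant selects the examples used in a single iteration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                              # toy training set

sgd_batch  = X[rng.integers(len(X), size=1)]                # SGD: 1 example
mini_batch = X[rng.choice(len(X), size=64, replace=False)]  # mini-batch: random subset
full_batch = X                                              # batch GD: every example

print(sgd_batch.shape, mini_batch.shape, full_batch.shape)
# (1, 5) (64, 5) (1000, 5)
</syntaxhighlight>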
==Explain Like I'm 5 (ELI5)==
Machine learning relies on iteration to help computers learn from data. The computer begins with an initial guess about how to make predictions, and then updates its guess according to how well it did. This cycle continues until it gets as close to the right answer as possible.
[[Category:Terms]] [[Category:Machine learning terms]] [[Category:not updated]]