Iteration

From AI Wiki
See also: Machine learning terms

Introduction

In machine learning, an iteration is a single update of a model's parameters (weights and biases) during training. The number of examples the model processes in each iteration is determined by the hyperparameter batch size. If the batch size is 50, the model processes 50 examples before updating its parameters - that is one iteration.
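The relationship between dataset size, batch size, and iterations can be sketched with some made-up numbers (a 1,000-example dataset is an assumption for illustration):

```python
# Hypothetical numbers: a dataset of 1,000 examples trained
# with a batch size of 50.
dataset_size = 1000
batch_size = 50

# Each iteration processes one batch, so one full pass over the
# dataset (one epoch) takes dataset_size / batch_size iterations.
iterations_per_epoch = dataset_size // batch_size
print(iterations_per_epoch)  # 20
```

So with these numbers, the model's parameters are updated 20 times per epoch.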

Training a machine learning model is inherently iterative: the model's parameters are adjusted repeatedly based on the errors it makes on a training dataset. By repeating this process many times, the model learns from its errors and improves its accuracy on new data.

One common application of iteration in machine learning is gradient descent, an optimization algorithm designed to find the minimum of a cost function. In gradient descent, the model's parameters are updated iteratively based on the gradient of the cost function with respect to the parameters.
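As a minimal sketch, here is gradient descent on a one-parameter cost function cost(w) = (w - 3)^2, whose minimum lies at w = 3 (the cost function and learning rate are chosen purely for illustration):

```python
# Gradient of cost(w) = (w - 3)^2 with respect to w.
def gradient(w):
    return 2.0 * (w - 3.0)

w = 0.0             # initial parameter guess
learning_rate = 0.1

for iteration in range(100):
    # One iteration: compute the gradient, then step against it.
    w -= learning_rate * gradient(w)

print(round(w, 4))  # converges close to 3.0
```

Each pass through the loop is one iteration: the parameter moves a small step in the direction that decreases the cost.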

What Happens in an Iteration?

In training a neural network, a single iteration includes:

  1. A forward pass to calculate the loss on a single batch of data.
  2. A backward pass (backpropagation) to compute the gradient of the loss with respect to each parameter, followed by a parameter update scaled by the learning rate.
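The two steps above can be sketched for a toy single-weight linear model y = w * x with mean-squared-error loss (the batch values, initial weight, and learning rate are made up for illustration):

```python
# Toy batch; the true relationship is y = 2x.
batch_x = [1.0, 2.0, 3.0]
batch_y = [2.0, 4.0, 6.0]
w = 0.5
learning_rate = 0.05

# 1. Forward pass: predict, then compute the mean squared error.
preds = [w * x for x in batch_x]
loss = sum((p - y) ** 2 for p, y in zip(preds, batch_y)) / len(batch_x)

# 2. Backward pass: gradient of the loss with respect to w,
#    then one parameter update scaled by the learning rate.
grad = sum(2 * (p - y) * x
           for p, y, x in zip(preds, batch_y, batch_x)) / len(batch_x)
w -= learning_rate * grad

print(loss, w)  # loss on this batch, and the updated weight
```

Running many such iterations moves w toward 2.0, the weight that fits the batch.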

Types of Iterations

Machine learning often employs several types of iterations, such as:

  1. Stochastic gradient descent (SGD): when each iteration uses a single example of the training data. After processing that one example, the model updates its weights and biases. SGD is fast, but its updates can be noisy and unstable.
  2. Mini-batch gradient descent: when each iteration uses a randomly chosen subset of training data to balance speed of convergence with stability in the optimization process.
  3. Batch gradient descent: when each iteration uses all of the training data. This form of gradient descent offers stability but may be computationally expensive for large datasets.
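The three variants differ only in the batch size, as a small sketch makes concrete (the 8-example dataset and the `batches` helper are assumptions for illustration; the update step itself is omitted):

```python
import random

dataset = list(range(8))  # stand-in for 8 training examples

def batches(data, batch_size, shuffle=True):
    """Yield the dataset in chunks of batch_size examples."""
    data = data[:]
    if shuffle:
        random.shuffle(data)
    for i in range(0, len(data), batch_size):
        yield data[i:i + batch_size]

# SGD:        batch size 1 -> 8 iterations (updates) per epoch
# Mini-batch: batch size 4 -> 2 iterations per epoch
# Batch:      batch size 8 -> 1 iteration per epoch
for name, bs in [("sgd", 1), ("mini-batch", 4), ("batch", 8)]:
    n_updates = sum(1 for _ in batches(dataset, bs))
    print(name, n_updates)
```

Smaller batches mean more frequent but noisier updates; the full batch gives one stable update per epoch at a higher per-step cost.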

Explain Like I'm 5 (ELI5)

Iteration in machine learning is like making educated guesses to find the correct answer. Imagine playing a guessing game with friends: they tell you whether your guess is too high or too low, and you use that feedback to make a better guess the next time around. This process of making a guess and using feedback to refine it is known as iteration.

Machine learning relies on iteration to help computers learn from data. The computer begins with an initial guess about how to make predictions, and then updates its guess according to how well it did. This cycle continues until it gets as close to the right answer as possible.