
Batch size: Difference between revisions

1,457 bytes added, 17 February 2023
==Introduction==
Machine learning relies on a hyperparameter called batch size, which indicates how many samples are processed before the model's internal parameters are updated. This number can vary based on both machine memory capacity and the needs of each model and dataset.

==Batch Size and Gradient Descent==
Gradient descent relies on batch size as a key parameter that determines how many samples are used in each iteration of the algorithm. Gradient descent works by iter...
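As a hedged illustration of the role batch size plays in a parameter update, the sketch below runs mini-batch gradient descent on a simple linear-regression problem; the data, learning rate, and 32-sample batch size are assumptions chosen for the example, not values from this article.

```python
import numpy as np

# Illustrative mini-batch gradient descent on synthetic linear data.
# batch_size controls how many samples contribute to each update;
# in practice it is bounded by available memory.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=1000)

w = np.zeros(3)
batch_size = 32   # hyperparameter: samples per parameter update
lr = 0.1          # learning rate

for epoch in range(20):
    perm = rng.permutation(len(X))           # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # gradient of mean-squared error over this batch only
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad
```

With a batch size of 1 this degenerates to stochastic gradient descent, and with a batch size of 1000 (the full dataset) it becomes ordinary batch gradient descent; intermediate values trade gradient noise against memory and compute per step.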