#[[Mini-batch gradient descent]]: Mini-batch gradient descent updates the model using a randomly chosen subset of the training data, balancing speed of convergence with stability in the optimization process. Compared with updating on single examples, averaging the gradient over a small batch reduces the variance of each update.
#[[Batch gradient descent]]: With batch gradient descent, the model is updated using all of the training data at every step. This form of gradient descent offers stable updates but can be computationally expensive for large datasets. A short sketch contrasting the two update rules appears after this list.
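The following minimal NumPy sketch contrasts the two update rules on a simple linear-regression objective. The synthetic data, learning rate of 0.1, batch size of 32, and iteration count are illustrative assumptions chosen for the example, not values prescribed by either algorithm.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative synthetic regression problem: 1000 examples, 3 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=1000)

def gradient(w, X_sub, y_sub):
    """Gradient of mean squared error over the subset (X_sub, y_sub)."""
    residual = X_sub @ w - y_sub
    return 2.0 * X_sub.T @ residual / len(y_sub)

# Batch gradient descent: every update uses the full training set.
w_batch = np.zeros(3)
for _ in range(200):
    w_batch -= 0.1 * gradient(w_batch, X, y)

# Mini-batch gradient descent: each update uses a random subset,
# trading some per-step stability for much cheaper computation.
w_mini = np.zeros(3)
batch_size = 32  # an assumed, illustrative batch size
for _ in range(200):
    idx = rng.choice(len(y), size=batch_size, replace=False)
    w_mini -= 0.1 * gradient(w_mini, X[idx], y[idx])

print("batch:", np.round(w_batch, 2), "mini-batch:", np.round(w_mini, 2))
</syntaxhighlight>

Both runs recover weights close to <code>true_w</code>; the mini-batch version computes each gradient on 32 examples instead of 1000, which is where its speed advantage on large datasets comes from.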
==Explain Like I'm 5 (ELI5)==
Machine learning relies on iteration to help computers learn from data. The computer begins with an initial guess about how to make predictions, and then updates its guess according to how well it did. This cycle continues until it gets as close to the right answer as possible.
[[Category:Terms]] [[Category:Machine learning terms]]