==Regularization==
Gradient descent can also be enhanced with [[regularization]] techniques, which reduce overfitting and improve the generalization of the model. Regularization techniques like [[L1 regularization|L1]] or [[L2 regularization|L2]] add a penalty term to the cost function that penalizes large parameter values; this encourages the model to use smaller parameter values and helps prevent overfitting.
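The sketch below illustrates the idea under simple assumptions: plain batch gradient descent on a linear-regression mean-squared-error cost with an added L2 penalty. The function name, learning rate <code>lr</code>, and penalty weight <code>lam</code> are illustrative choices, not part of the article.

<syntaxhighlight lang="python">
import numpy as np

def gradient_descent_l2(X, y, lr=0.01, lam=0.1, epochs=1000):
    """Gradient descent on J(w) = (1/n) * ||Xw - y||^2 + lam * ||w||^2.

    The L2 penalty adds a 2 * lam * w term to the gradient, which
    shrinks large weights toward zero on every update.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        residual = X @ w - y                  # prediction error
        grad = (2.0 / n) * (X.T @ residual)   # gradient of the data term
        grad += 2.0 * lam * w                 # gradient of the L2 penalty
        # An L1 penalty would instead add lam * np.sign(w) here.
        w -= lr * grad                        # standard descent step
    return w
</syntaxhighlight>

Because the penalty gradient always points back toward zero, larger values of <code>lam</code> pull the learned weights closer to zero, trading some training-set fit for better generalization.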
==Explain Like I'm 5 (ELI5)==