Stability

#[[Cross-validation]]: This method is commonly used for evaluating model stability. The data is divided into multiple [[fold]]s, and the model is trained and assessed on each fold. The average performance across all folds then serves as a measure of how stable the model is (a sketch of this procedure follows the list).
#[[Bootstrapping]]: This resampling technique draws samples with replacement from the training data to generate multiple [[training set]]s. The model is then trained on each set, and its average performance is used to assess its stability, as sketched below.
#[[Regularization]]: This technique helps control [[overfitting]] by adding a penalty term to the loss function. Regularization improves model stability by preventing the model from fitting [[noise]] in the data; see the sketch after this list.



Stability is a model's capacity to remain accurate even when small changes are made to its training data or construction. Think of it like building a tower of blocks: if it is constructed poorly, even a light wind can bring it down; if it is built robustly and securely, even strong winds won't knock it over. We can make our tower stronger using techniques like cross-validation, bootstrapping, and regularization.


[[Category:Terms]] [[Category:Machine learning terms]]