{{see also|machine learning terms}}
==Introduction==
[[Accuracy]] in [[machine learning]] is a [[metric]] for evaluating the performance of a [[classification]] [[model]]: the percentage of correct [[predictions]] the model makes on test data out of all predictions made. Accuracy is one of the most frequently used metrics in machine learning and serves as a standard for comparing models' results.
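
Concretely, accuracy is the number of correct predictions divided by the total number of predictions. As a minimal sketch (assuming [[NumPy]] arrays of true and predicted labels; the <code>accuracy</code> function name here is only illustrative), the metric can be computed like this:

<syntaxhighlight lang="python">
import numpy as np

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # Elementwise comparison gives a boolean array; its mean is the
    # share of correct predictions.
    return np.mean(y_true == y_pred)

# Example: 4 of 5 predictions are correct, so accuracy is 0.8
y_true = [0, 1, 1, 0, 1]
y_pred = [0, 1, 0, 0, 1]
print(accuracy(y_true, y_pred))  # 0.8
</syntaxhighlight>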


==Explain Like I'm 5 (ELI5)==
Accuracy is a measure of how good a computer program is at distinguishing things. For instance, if we want it to distinguish between pictures of cats and dogs, accuracy measures how many pictures it gets right out of all the pictures it looks at. The higher the accuracy, the better the program is at telling cats and dogs apart.
[[Category:Terms]] [[Category:Machine learning terms]]