Accuracy

In machine learning, accuracy refers to the ability of a model to correctly predict the output for a given input. It is a common metric used to evaluate the performance of a model, particularly in classification tasks.

==Mathematical Definition==
The accuracy of a model in a classification task is the number of correct [[prediction]]s divided by the total number of predictions. It can be expressed mathematically as:


<math>\text{Accuracy} = \frac{\text{Number of correct predictions}}{\text{Total number of predictions}}</math>


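The formula above can be sketched as a short Python function (a minimal illustration; the function and variable names are not from this article):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    # Count positions where the predicted label equals the true label.
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    # Divide by the total number of predictions.
    return correct / len(y_true)

# 3 of the 4 predictions match the true labels, so accuracy is 0.75.
print(accuracy([1, 0, 1, 1], [1, 0, 0, 1]))
```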
==Example==