Training loss

Machine learning employs a variety of [[loss function]]s, depending on the problem being solved and the model being employed. Some commonly employed loss functions include:
#[[Mean squared error]] (MSE) Loss: This loss function is the most commonly employed for [[regression]] problems. It measures the average squared difference between the predicted and actual outputs.
#[[Binary cross-entropy]] (BCE): Used in [[binary classification]] problems where the objective is to accurately predict one of two possible [[class]]es, this loss measures the difference between the predicted probability of the [[positive class]] and the actual [[binary label]].
#[[Categorical cross-entropy]] (CCE): Used in multiclass classification problems to predict one of several classes, this loss measures the difference between a predicted probability distribution and the actual one-hot encoded class label.
#Softmax Cross-Entropy Loss: Used for multiclass classification problems with mutually exclusive classes, this loss applies the [[softmax function]] to the model's raw outputs (logits) to produce a probability distribution, then computes the categorical cross-entropy between that distribution and the true class label.
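The losses listed above can be sketched in NumPy as follows. This is an illustrative sketch, not the implementation used by any particular framework; the function names and the small epsilon used to avoid `log(0)` are choices made here for clarity.

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error: average squared difference between
    # predicted and actual outputs.
    return np.mean((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, p_pred, eps=1e-12):
    # y_true holds binary labels (0 or 1); p_pred holds the
    # predicted probability of the positive class.
    p = np.clip(p_pred, eps, 1 - eps)  # guard against log(0)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

def categorical_cross_entropy(y_onehot, p_pred, eps=1e-12):
    # y_onehot: (n_samples, n_classes) one-hot labels;
    # p_pred: predicted probability distribution per sample.
    p = np.clip(p_pred, eps, 1.0)
    return -np.mean(np.sum(y_onehot * np.log(p), axis=1))

def softmax_cross_entropy(y_onehot, logits):
    # Softmax over raw logits, then categorical cross-entropy.
    # Subtracting the row max keeps the exponentials numerically stable.
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -np.mean(np.sum(y_onehot * log_probs, axis=1))
```

For example, with uniform logits over three classes, softmax cross-entropy against any one-hot label evaluates to ln 3, the loss of a maximally uncertain prediction.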