Cross-entropy: Revision history


19 March 2023

  • 19:14, 19 March 2023 · Walle (talk | contribs) · 3,615 bytes (+3,615) · Created page with "{{see also|Machine learning terms}} ==Introduction== Cross-entropy is a measure of the dissimilarity between two probability distributions, commonly used in machine learning, particularly in the context of training neural networks and other classification models. It serves as a widely used loss function in optimization algorithms, where the objective is to minimize the discrepancy between the predicted distribution and the true distribution of data. In this article,..."
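The edit summary above describes cross-entropy as a measure of dissimilarity between a true distribution and a model's predicted distribution. As a minimal sketch of that idea (this example is illustrative only and not part of the revised article; the function name and sample values are assumptions), the standard discrete formula H(p, q) = -Σᵢ pᵢ log qᵢ can be written as:

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) between two discrete distributions:
    p is the true distribution, q is the predicted distribution.
    eps guards against log(0) for zero-probability predictions."""
    return -sum(pi * math.log(qi + eps) for pi, qi in zip(p, q))

# Hypothetical example: a one-hot true label and a model's softmax output.
p = [1.0, 0.0, 0.0]
q = [0.7, 0.2, 0.1]
loss = cross_entropy(p, q)  # equals -log(0.7), roughly 0.357
```

Minimizing this quantity over model parameters drives the predicted distribution q toward the true distribution p, which is the training objective the summary refers to.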