[source]

# Cross Entropy

Cross Entropy (or log loss) measures the performance of a classification model whose output is a probability distribution over the possible classes. Cross Entropy increases as the predicted probability distribution diverges from the actual distribution.

$$ \mathrm{Cross\ Entropy} = -\sum_{c=1}^{M} y_{o,c}\log(p_{o,c}) $$

where $M$ is the number of classes, $y_{o,c}$ is a binary indicator of whether class $c$ is the correct label for observation $o$, and $p_{o,c}$ is the predicted probability that observation $o$ belongs to class $c$.
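As an illustrative example (the numbers here are hypothetical), take $M = 3$ classes where the true class of observation $o$ is the second one, so $y_o = (0, 1, 0)$, and the model predicts probabilities $(0.1, 0.7, 0.2)$. Only the term for the true class contributes to the sum:

$$ -\left(0 \cdot \log 0.1 + 1 \cdot \log 0.7 + 0 \cdot \log 0.2\right) = -\log 0.7 \approx 0.357 $$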

## Parameters

This cost function does not have any parameters.

## Example

```php
use Rubix\ML\NeuralNet\CostFunctions\CrossEntropy;

$costFunction = new CrossEntropy();
```
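For intuition only, below is a minimal plain-PHP sketch of the formula above for a single sample with a one-hot target. It is not the library's internal implementation (which computes the loss over whole matrices of network activations); the function name `crossEntropy` and the sample values are purely illustrative.

```php
<?php

// Illustrative sketch only -- not Rubix ML's internal implementation.
// Computes -sum_c y_c * log(p_c) for one sample with a one-hot target.
function crossEntropy(array $target, array $predicted) : float
{
    $loss = 0.0;

    foreach ($target as $c => $y) {
        // Only the term for the true class contributes when the target is one-hot.
        $loss -= $y * log($predicted[$c]);
    }

    return $loss;
}

echo crossEntropy([0, 1, 0], [0.1, 0.7, 0.2]); // ≈ 0.3567, i.e. -log(0.7)
```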