
[source]

# Cross Entropy

Cross Entropy (or log loss) measures the performance of a classification model whose output is a probability distribution over the possible classes. Cross entropy increases as the predicted probability distribution diverges from the actual distribution.

$$\text{CrossEntropy} = -\sum_{c=1}^{M} y_{o,c} \log(p_{o,c})$$
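As an illustration, here is a minimal plain-PHP sketch of the formula for a single sample, where $y_{o,c}$ is the one-hot target and $p_{o,c}$ is the predicted probability for class $c$. The arrays below are made-up values and are not part of the library's API.

```php
<?php

// Hypothetical one-hot target and predicted probabilities for one sample over 3 classes.
$target = [0.0, 1.0, 0.0];
$probabilities = [0.2, 0.7, 0.1];

// Cross entropy: the negative sum over classes of the target times the log of the predicted probability.
$loss = 0.0;

foreach ($target as $class => $y) {
    $loss -= $y * log($probabilities[$class]);
}

echo $loss; // ≈ 0.3567, i.e. -log(0.7)
```

In practice, implementations typically clip the predicted probabilities away from 0 to avoid taking the log of zero.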

## Parameters

This cost function does not have any parameters.

## Example

```php
use Rubix\ML\NeuralNet\CostFunctions\CrossEntropy;

$costFunction = new CrossEntropy();
```
