
[source]

# Relative Entropy

Relative Entropy (or Kullback-Leibler divergence) is a measure of how the expectation and activation of the network diverge. It is closely related to Cross Entropy, differing only by the entropy of the first distribution; since it is asymmetric, it does not qualify as a true statistical distance metric.

$$KL(\hat{y} \,\|\, y) = \sum_{c=1}^{M} \hat{y}_c \log \frac{\hat{y}_c}{y_c}$$
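For intuition, the following is a minimal plain-PHP sketch that evaluates the sum above for a single example. The function name `relativeEntropy`, the toy vectors, and the epsilon clamp (a common safeguard against `log(0)` and division by zero) are illustrative assumptions and not the library's internal implementation.

```php
/**
 * Compute KL(yHat || y) for one pair of probability vectors.
 * Illustrative sketch only - not the library's implementation.
 *
 * @param float[] $yHat predicted class probabilities (activations)
 * @param float[] $y    target class probabilities (expectations)
 */
function relativeEntropy(array $yHat, array $y, float $epsilon = 1e-8) : float
{
    $loss = 0.0;

    foreach ($yHat as $c => $activation) {
        // Clamp both terms away from zero to keep the log and division finite.
        $p = max($activation, $epsilon);
        $q = max($y[$c], $epsilon);

        $loss += $p * log($p / $q);
    }

    return $loss;
}

$yHat = [0.7, 0.2, 0.1];
$y = [1.0, 0.0, 0.0];

echo relativeEntropy($yHat, $y), PHP_EOL; // larger when the distributions disagree more
```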

## Parameters

This cost function does not have any parameters.

## Example

```php
use Rubix\ML\NeuralNet\CostFunctions\RelativeEntropy;

$costFunction = new RelativeEntropy();
```
