# Softmax
The Softmax function is a generalization of the Sigmoid function that squashes each activation to a value between 0 and 1 with the addition that all activations sum to 1. Together, these properties allow the output of the Softmax function to be interpreted as a discrete probability distribution over mutually exclusive outcomes.
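Concretely, for a vector of $K$ activations $z$, the $i$-th Softmax output is given by the standard definition:

$$
\mathrm{softmax}(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}
$$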
## Parameters
This activation function does not have any parameters.
## Example
```php
use Rubix\ML\NeuralNet\ActivationFunctions\Softmax;

$activationFunction = new Softmax();
```
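To illustrate the math numerically, here is a minimal plain-PHP sketch of the Softmax computation. Note that this is only an illustration of the formula above, not Rubix ML's internal implementation.

```php
// Illustrative only - not Rubix ML's internal implementation.
$activations = [1.0, 2.0, 3.0];

// Subtract the max activation before exponentiating for numerical stability.
$max = max($activations);

$exp = array_map(fn ($z) => exp($z - $max), $activations);

$sum = array_sum($exp);

// Normalize so the outputs sum to 1.
$probabilities = array_map(fn ($e) => $e / $sum, $exp);

print_r($probabilities); // approx. [0.090, 0.245, 0.665]
```

Each output lies between 0 and 1 and the three values sum to 1, which is what makes them interpretable as probabilities.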