# Softmax

The Softmax function is a generalization of the Sigmoid function that squashes each activation to a value between 0 and 1 such that all activations sum to exactly 1. Together, these properties allow the output of the Softmax function to be interpreted as a categorical probability distribution over mutually exclusive outcomes, such as class labels.
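
Concretely, for an input vector $z$ with $n$ components, the standard definition of Softmax is

$$
\operatorname{Softmax}(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{n} e^{z_j}}
$$

Exponentiation makes every output strictly positive, and dividing by the sum of all exponentials normalizes the outputs so they total exactly 1.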

## Parameters

This activation function does not have any parameters.

## Example

```php
use Rubix\ML\NeuralNet\ActivationFunctions\Softmax;

$activationFunction = new Softmax();
```
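
For intuition, here is a minimal, self-contained sketch of the computation itself in plain PHP. It illustrates the formula above and is not the library's internal implementation; the `softmax()` function name is hypothetical.

```php
/**
 * A minimal sketch of the Softmax computation, for illustration
 * only. Not the library's internal implementation.
 *
 * @param float[] $activations
 * @return float[]
 */
function softmax(array $activations) : array
{
    // Subtract the max activation before exponentiating to guard
    // against floating-point overflow. Softmax is invariant to
    // adding a constant to every input, so the result is unchanged.
    $max = max($activations);

    $exponentials = array_map(
        fn ($value) => exp($value - $max),
        $activations
    );

    $sigma = array_sum($exponentials);

    // Normalize so the outputs sum to exactly 1.
    return array_map(
        fn ($value) => $value / $sigma,
        $exponentials
    );
}

print_r(softmax([1.0, 2.0, 3.0]));
// Approximately: [0.0900, 0.2447, 0.6652]
```

Note how larger activations receive a disproportionately larger share of the probability mass, which is why Softmax is a common choice for the output layer of multiclass classifiers.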