# Softsign
A smooth sigmoid-shaped function that squashes the input between -1 and 1.
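For reference, the standard definition of the Softsign function (not shown on the original page) maps an input activation $z$ to the open interval $(-1, 1)$:

$$
\operatorname{Softsign}(z) = \frac{z}{1 + |z|}, \qquad \frac{d}{dz}\operatorname{Softsign}(z) = \frac{1}{(1 + |z|)^2}
$$

Compared to the logistic sigmoid, its tails approach the asymptotes polynomially rather than exponentially, so the gradient decays more slowly for large $|z|$.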
## Parameters
This activation function does not have any parameters.
## Example
```php
use Rubix\ML\NeuralNet\ActivationFunctions\Softsign;

$activationFunction = new Softsign();
```
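As an illustration, the activation function object is typically passed to a hidden `Activation` layer of a neural network estimator. The sketch below assumes the usual Rubix ML layer and estimator constructors (`Dense`, `Activation`, `MultilayerPerceptron`); the layer sizes are arbitrary example values.

```php
use Rubix\ML\Classifiers\MultilayerPerceptron;
use Rubix\ML\NeuralNet\Layers\Dense;
use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\ActivationFunctions\Softsign;

// Example sketch: Softsign as the non-linearity of two hidden layers.
$estimator = new MultilayerPerceptron([
    new Dense(64),
    new Activation(new Softsign()),
    new Dense(32),
    new Activation(new Softsign()),
]);
```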
## References
1. X. Glorot et al. (2010). Understanding the Difficulty of Training Deep Feedforward Neural Networks.