# Soft Plus
Soft Plus is a smooth approximation of the piecewise-linear ReLU activation function, defined as `f(x) = log(1 + e^x)`. Unlike ReLU, it is differentiable everywhere and its output is always strictly positive.
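To make the formula concrete, here is a minimal numeric sketch of the softplus function itself (in Python for illustration; the library is PHP, and this is not the library's implementation):

```python
import math

def softplus(x: float) -> float:
    """Softplus: log(1 + e^x), a smooth approximation of ReLU."""
    # For large x, log(1 + e^x) ~ x; returning x directly avoids
    # overflowing exp() while introducing negligible error.
    if x > 30.0:
        return x
    # log1p(y) computes log(1 + y) accurately for small y.
    return math.log1p(math.exp(x))

# Softplus(0) = log(2), roughly 0.693
print(softplus(0.0))
```

Note how the function behaves like ReLU at the extremes: it approaches `x` for large positive inputs and approaches `0` (from above) for large negative inputs, while remaining smooth near zero.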
## Parameters
This activation function does not have any parameters.
## Example
```php
use Rubix\ML\NeuralNet\ActivationFunctions\SoftPlus;

$activationFunction = new SoftPlus();
```
## References

1. X. Glorot et al. (2011). Deep Sparse Rectifier Neural Networks.
Last update: 2021-03-03