Soft Plus

A smooth approximation of the piecewise-linear ReLU (rectified linear unit) activation function.

\[ \operatorname{SoftPlus}(x) = \log\left(1 + e^{x}\right) \]
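
Unlike ReLU, the output is strictly positive and the function is differentiable everywhere, with the logistic sigmoid as its derivative. As a quick illustration, the formula can be evaluated directly; the following is a plain-PHP sketch for illustration only, not part of the library's API:

// Plain-PHP sketch of the Soft Plus formula (illustration only, not the library API).
function softplus(float $x) : float
{
    return log(1.0 + exp($x));
}

echo softplus(-4.0) . PHP_EOL; // ~0.0181 (approaches 0 for large negative x)
echo softplus(0.0) . PHP_EOL;  // log(2) ~ 0.6931
echo softplus(4.0) . PHP_EOL;  // ~4.0181 (approaches x for large positive x)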

Parameters

This activation function does not have any parameters.

Example

use Rubix\ML\NeuralNet\ActivationFunctions\SoftPlus;

// Instantiate the Soft Plus activation function.
$activationFunction = new SoftPlus();
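
In practice the activation function is usually handed to a hidden layer rather than used on its own. A minimal sketch, assuming the MultilayerPerceptron classifier and the Dense and Activation hidden layers provided by Rubix ML:

use Rubix\ML\Classifiers\MultilayerPerceptron;
use Rubix\ML\NeuralNet\Layers\Dense;
use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\ActivationFunctions\SoftPlus;

// Use Soft Plus as the non-linearity of a 100-neuron hidden layer.
$estimator = new MultilayerPerceptron([
    new Dense(100),
    new Activation(new SoftPlus()),
]);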

References

1. X. Glorot et al. (2011). Deep Sparse Rectifier Neural Networks.