

# ELU

Exponential Linear Units are a type of rectifier that softens the transition from non-activated to activated using the exponential function. As such, ELU produces smoother gradients than the piecewise linear ReLU function.

$$
\operatorname{ELU}(x) =
\begin{cases}
    \alpha \left( e^{x} - 1 \right) & \text{if } x \leq 0 \\
    x & \text{if } x > 0
\end{cases}
$$
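
The piecewise definition above translates almost directly into code. The following is a minimal sketch of the function and its derivative for a scalar input; it is not Rubix ML's internal implementation, which operates on whole matrices of activations.

```php
<?php

// Minimal sketch of the ELU definition above (not the Rubix ML implementation).
// The negative branch saturates at -$alpha as the input grows more negative.
function elu(float $x, float $alpha = 1.0) : float
{
    return $x > 0.0 ? $x : $alpha * (exp($x) - 1.0);
}

// Derivative used during backpropagation: 1 for x > 0, alpha * e^x otherwise.
function eluDerivative(float $x, float $alpha = 1.0) : float
{
    return $x > 0.0 ? 1.0 : $alpha * exp($x);
}

echo elu(2.0) . PHP_EOL;  // 2
echo elu(-2.0) . PHP_EOL; // ≈ -0.8647 with the default alpha of 1.0
```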

## Parameters

| # | Name | Default | Type | Description |
|---|------|---------|------|-------------|
| 1 | alpha | 1.0 | float | The value at which leakage will begin to saturate. Ex. alpha = 1.0 means that the output will never be less than -1.0 when inactivated. |
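
The saturation behavior described for alpha follows from the negative branch of the definition: as the input grows more negative, the exponential term vanishes and the output approaches $-\alpha$.

$$
\lim_{x \to -\infty} \alpha \left( e^{x} - 1 \right) = -\alpha
$$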

## Example

```php
use Rubix\ML\NeuralNet\ActivationFunctions\ELU;

$activationFunction = new ELU(2.5);
```
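
With an alpha of 2.5 as in this example, negative activations saturate at -2.5 rather than the default -1.0; a larger alpha allows stronger negative responses before the unit saturates.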

## References


  1. D. A. Clevert et al. (2016). Fast and Accurate Deep Network Learning by Exponential Linear Units. 

