ReLU#

Rectified Linear Units (ReLU) output only the positive part of the input signal: inputs at or below the activation threshold (0 by default) are zeroed, while inputs above it pass through unchanged. ReLU has the benefits of a monotonic derivative and being cheap to compute.
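
As a minimal sketch of that rule, assuming a scalar input (the helper relu() below is hypothetical; the library itself applies the activation to whole matrices of values), the computation can be written in plain PHP:

// Pass the input through when it exceeds the threshold,
// otherwise output zero.
function relu(float $x, float $threshold = 0.) : float
{
    return $x > $threshold ? $x : 0.;
}

relu(0.5);   // 0.5
relu(-2.0);  // 0.0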

Parameters#

#   Param       Default   Type    Description
1   threshold   0.        float   The input value necessary to trigger an activation.

Example#

use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;

// Instantiate ReLU with an activation threshold of 0.1.
$activationFunction = new ReLU(0.1);
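
With the threshold set to 0.1 as above, inputs of 0.1 or below produce 0, while larger inputs pass through unchanged.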

References#

  • A. L. Maas et al. (2013). Rectifier Nonlinearities Improve Neural Network Acoustic Models.
  • K. Konda et al. (2015). Zero-bias Autoencoders and the Benefits of Co-adapting Features.