Rectified Linear Units (ReLU) pass positive inputs through unchanged and output 0 for all non-positive inputs. They have the benefit of a monotonic derivative and are cheap to compute.
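Formally, ReLU and its derivative (conventionally taking the derivative at zero to be 0) can be written as:

$$
\operatorname{ReLU}(x) = \max(0,\, x)
\qquad
\operatorname{ReLU}'(x) =
\begin{cases}
1 & \text{if } x > 0 \\
0 & \text{otherwise}
\end{cases}
$$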
This activation function does not have any parameters.
```php
use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;

$activationFunction = new ReLU();
```
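In practice the activation function is usually passed to a hidden layer rather than used on its own. A minimal sketch, assuming Rubix ML's `Dense` and `Activation` hidden layers:

```php
use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;
use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\Layers\Dense;

// A dense layer of 100 neurons followed by a ReLU activation layer.
$layers = [
    new Dense(100),
    new Activation(new ReLU()),
];
```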
- A. L. Maas et al. (2013). Rectifier Nonlinearities Improve Neural Network Acoustic Models.
- K. Konda et al. (2015). Zero-bias Autoencoders and the Benefits of Co-adapting Features.