# ReLU
Rectified Linear Units (ReLU) pass through the positive part of the input signal and output 0 otherwise. They have the benefit of a monotonic derivative and are cheap to compute.
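Formally, the rectifier clips negative values to zero:

$$
\operatorname{ReLU}(x) = \max(0, x)
$$

Its derivative is 1 for positive inputs and 0 for negative inputs, which keeps backpropagation simple and fast.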
## Parameters
This activation function does not have any parameters.
## Example
```php
use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;

$activationFunction = new ReLU();
```
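As an illustration of where the activation function is used, it is typically supplied to a hidden layer when composing a network. The following is a minimal sketch, assuming the `Dense` and `Activation` layer classes from Rubix ML's NeuralNet subsystem:

```php
use Rubix\ML\NeuralNet\Layers\Dense;
use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;

// Apply the ReLU non-linearity to the outputs of a 100 neuron hidden layer.
$layers = [
    new Dense(100),
    new Activation(new ReLU()),
];
```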