ReLU#
Rectified Linear Units (ReLU) pass through only the positive part of the input signal, outputting zero for non-positive inputs. They have the benefit of a monotonic derivative and are cheap to compute.
\[
\operatorname{ReLU}(x) = \begin{cases} 0 & \text{if } x \leq 0 \\ x & \text{if } x > 0 \end{cases} = \max \{0, x\}
\]
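To make the formula concrete, here is a minimal standalone PHP sketch of the rectifier, applying max(0, x) elementwise to an array; the `relu` helper is purely illustrative and is not the library's internal implementation.

// Illustrative only: apply max(0, x) to each element of an input array.
function relu(array $inputs): array
{
    return array_map(fn (float $x): float => max(0.0, $x), $inputs);
}

// Negative inputs are zeroed; positive inputs pass through unchanged.
relu([-2.0, 0.0, 3.0]); // [0.0, 0.0, 3.0]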
Parameters#
This activation function does not have any parameters.
Example#
use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;
$activationFunction = new ReLU();
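As a hedged usage sketch, an activation function such as ReLU is typically supplied to a hidden layer of a Rubix ML neural network; the `Activation` layer below is assumed to be available under `Rubix\ML\NeuralNet\Layers\Activation`.

use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;

// Assumed usage: wrap ReLU in an Activation hidden layer.
$layer = new Activation(new ReLU());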