# SiLU

Sigmoid Linear Units (SiLU) are smooth, non-monotonic rectified activation functions. Each input is weighted by its own Sigmoid activation, i.e. f(x) = x * sigmoid(x), so the input acts as a self-gating mechanism.
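
To make the gating concrete, here is a minimal standalone sketch of the SiLU formula, f(x) = x * sigmoid(x), in plain PHP. It is for illustration only and is not the library's internal implementation.

```php
// Illustrative only: SiLU computed directly from its definition,
// f(x) = x * sigmoid(x), where sigmoid(x) = 1 / (1 + exp(-x)).
$silu = function (float $x) : float {
    return $x * (1.0 / (1.0 + exp(-$x)));
};

echo $silu(2.0) . PHP_EOL;  // ~1.7616
echo $silu(-2.0) . PHP_EOL; // ~-0.2384, negative inputs are not hard-clipped to 0
```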

## Parameters

This activation function does not have any parameters.

## Example

```php
use Rubix\ML\NeuralNet\ActivationFunctions\SiLU;

$activationFunction = new SiLU();
```
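
As a usage sketch, SiLU can serve as the activation of a hidden layer in a neural network. The example below assumes the Dense and Activation layer classes from the library's NeuralNet subsystem.

```php
use Rubix\ML\NeuralNet\ActivationFunctions\SiLU;
use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\Layers\Dense;

// A dense (fully-connected) hidden layer followed by a SiLU activation layer.
$hidden = [
    new Dense(100),
    new Activation(new SiLU()),
];
```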

## References


1. S. Elfwing et al. (2017). Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning.

