# Thresholded ReLU

A version of the ReLU activation function that only activates when the input is greater than a user-specified threshold. Inputs at or below the threshold output 0.

$$
\text{ThresholdedReLU}(x) = \begin{cases} 0 & \text{if } x \leq \theta \\ x & \text{if } x > \theta \end{cases}
$$
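
To make the piecewise definition concrete, here is a minimal plain-PHP sketch of the function applied to a single scalar. The `thresholdedReLU` helper below is hypothetical and for illustration only; it is not part of the library, which applies activation functions to whole matrices internally.

```php
// Hypothetical stand-alone helper for illustration - not part of Rubix ML.
// Outputs 0 for inputs at or below the threshold, and passes the input
// through unchanged otherwise.
function thresholdedReLU(float $x, float $theta = 1.0) : float
{
    return $x > $theta ? $x : 0.0;
}

echo thresholdedReLU(0.3, 0.5); // 0 (at or below threshold)
echo PHP_EOL;
echo thresholdedReLU(2.0, 0.5); // 2 (passes through)
```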

## Parameters

| # | Name | Default | Type | Description |
|---|------|---------|------|-------------|
| 1 | threshold | 1.0 | float | The threshold at which the neuron is activated. |

## Example

```php
use Rubix\ML\NeuralNet\ActivationFunctions\ThresholdedReLU;

$activationFunction = new ThresholdedReLU(0.5);
```
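
As a usage sketch, the activation function is typically wrapped in an Activation hidden layer when composing a network-based estimator such as MultilayerPerceptron. The layer sizes below are illustrative assumptions, not recommendations from this page.

```php
use Rubix\ML\Classifiers\MultilayerPerceptron;
use Rubix\ML\NeuralNet\Layers\Dense;
use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\ActivationFunctions\ThresholdedReLU;

// Two hidden layers, each followed by a Thresholded ReLU activation.
$estimator = new MultilayerPerceptron([
    new Dense(64),
    new Activation(new ThresholdedReLU(0.5)),
    new Dense(64),
    new Activation(new ThresholdedReLU(0.5)),
]);
```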

## References


  1. K. Konda et al. (2015). Zero-bias autoencoders and the benefits of co-adapting features. 

