PReLU#

Parametric Rectified Linear Units (PReLU) are leaky rectifiers whose leakage coefficient is learned during training. Unlike standard Leaky ReLUs, whose leakage remains constant, PReLU layers can adjust the leakage to better suit the model on a per-node basis.

$$\mathrm{PReLU}(x) = \begin{cases} \alpha x & \text{if } x < 0 \\ x & \text{if } x \geq 0 \end{cases}$$
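
To make the piecewise definition concrete, below is a minimal plain-PHP sketch of the forward computation for a single node. The `prelu()` helper is hypothetical and not part of the library; `$alpha` stands for the node's learned leakage coefficient.

// Compute the PReLU activation for a single input given the node's
// learned leakage coefficient $alpha.
function prelu(float $x, float $alpha) : float
{
    // Negative inputs are scaled by the leakage coefficient,
    // non-negative inputs pass through unchanged.
    return $x < 0.0 ? $alpha * $x : $x;
}

echo prelu(-2.0, 0.25); // -0.5
echo prelu(3.0, 0.25);  // 3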

Parameters#

| # | Name | Default | Type | Description |
|---|------|---------|------|-------------|
| 1 | initializer | Constant | Initializer | The initializer of the leakage parameter. |

Example#

use Rubix\ML\NeuralNet\Layers\PReLU;
use Rubix\ML\NeuralNet\Initializers\Normal;

$layer = new PReLU(new Normal(0.5));
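
In a full network, a PReLU layer is typically placed directly after a Dense layer in the hidden-layer stack. The following sketch assumes the MLPClassifier estimator accepts its hidden layers as an array in the first constructor argument; see the MLPClassifier documentation for the complete signature and additional hyper-parameters.

use Rubix\ML\Classifiers\MLPClassifier;
use Rubix\ML\NeuralNet\Layers\Dense;
use Rubix\ML\NeuralNet\Layers\PReLU;
use Rubix\ML\NeuralNet\Initializers\Normal;

// Two hidden Dense layers, each followed by a learnable PReLU activation.
$estimator = new MLPClassifier([
    new Dense(100),
    new PReLU(new Normal(0.5)),
    new Dense(100),
    new PReLU(new Normal(0.5)),
]);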

References#


1. K. He et al. (2015). Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification.


Last update: 2021-03-03