[source]

# He

The He initializer was designed to initialize parameters that feed into rectified activation layers such as those employing ReLU, Leaky ReLU, or ELU. It draws values from a uniform distribution with limits defined as +/- (6 / fanOut) ** (1. / sqrt(2)).
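As a back-of-the-envelope illustration of those limits (the fan out value below is made up for the example, and this is not the library's internal code), a single weight could be drawn like so:

```php
// Illustrative sketch only; not Rubix ML's internal implementation.
$fanOut = 128; // assumed number of neurons in the layer

// Uniform distribution limits: +/- (6 / fanOut) ** (1 / sqrt(2)).
$limit = (6 / $fanOut) ** (1 / M_SQRT2);

// Draw one weight uniformly from [-$limit, $limit].
$weight = $limit * (2 * mt_rand() / mt_getrandmax() - 1);
```

For a fan out of 128, the limits work out to roughly +/- 0.115, which keeps the magnitude of the signal stable as it passes through rectified layers.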

## Parameters

This initializer does not have any parameters.

## Example

```php
use Rubix\ML\NeuralNet\Initializers\He;

$initializer = new He();
```
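In a network, the initializer is typically handed to a hidden layer rather than used on its own. A minimal sketch, assuming a `Dense` layer whose constructor takes the weight initializer as its fourth argument (the exact signature may differ between library versions, so check the `Dense` documentation for yours):

```php
use Rubix\ML\NeuralNet\Layers\Dense;
use Rubix\ML\NeuralNet\Initializers\He;

// Assumed argument order: neurons, L2 penalty, bias, weight initializer.
$layer = new Dense(100, 0.0, true, new He());
```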

## References


1. K. He et al. (2015). Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification.

