
# He

The He initializer was designed for hidden layers that feed into rectifier activation functions such as ReLU, Leaky ReLU, and ELU. It draws from a uniform distribution with limits defined as +/- sqrt(6 / fanIn), where fanIn is the number of inputs to the layer.
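As a rough illustration of that formula, here is a minimal sketch in plain PHP that computes the He-uniform limit and fills a fanOut x fanIn weight matrix. It is independent of the library's internals and assumes the initializer follows the He et al. (2015) derivation; the function name is hypothetical.

```php
// Sketch of He-uniform initialization (assumed to follow He et al. 2015):
// sample each weight from U(-a, a) where a = sqrt(6 / fanIn).
function heUniform(int $fanIn, int $fanOut) : array
{
    $limit = sqrt(6.0 / $fanIn);

    $w = [];

    for ($i = 0; $i < $fanOut; ++$i) {
        $row = [];

        for ($j = 0; $j < $fanIn; ++$j) {
            // mt_rand() / mt_getrandmax() yields a float in [0, 1],
            // which is rescaled to the interval [-limit, limit].
            $row[] = $limit * (2.0 * mt_rand() / mt_getrandmax() - 1.0);
        }

        $w[] = $row;
    }

    return $w;
}
```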

## Parameters

This initializer does not have any parameters.

## Example

```php
use Rubix\ML\NeuralNet\Initializers\He;

$initializer = new He();
```
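In practice, the initializer is typically passed to a hidden layer when defining a network. A minimal sketch, assuming a Dense layer whose constructor accepts a weight initializer as its fourth argument (verify the parameter order against the current Dense API):

```php
use Rubix\ML\NeuralNet\Layers\Dense;
use Rubix\ML\NeuralNet\Initializers\He;

// Assumed signature: Dense(neurons, l2Penalty, bias, weightInitializer).
$hidden = new Dense(100, 0.0, true, new He());
```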

## References

- K. He et al. (2015). Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification.