The He initializer was designed for hidden layers that feed into rectifier-type activation layers such as ReLU, Leaky ReLU, and ELU. It draws from a uniform distribution with limits defined as ±(6 / (fanIn + fanOut)) ** (1 / sqrt(2)).
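To make the limit formula concrete, here is a minimal sketch of the computation in plain PHP. The `heLimit()` helper is hypothetical and not part of the Rubix ML API; it only evaluates the expression above.

```php
<?php

// Hypothetical helper illustrating the limit formula above; it is
// not part of Rubix ML, just a way to make the arithmetic concrete.
function heLimit(int $fanIn, int $fanOut) : float
{
    return (6.0 / ($fanIn + $fanOut)) ** (1.0 / sqrt(2.0));
}

// For a layer with 256 inputs and 128 outputs ...
$limit = heLimit(256, 128);

// ... weights would be sampled uniformly from [-$limit, $limit].
echo $limit, PHP_EOL; // ≈ 0.053
```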
This initializer does not have any parameters.
```php
use Rubix\ML\NeuralNet\Initializers\He;

$initializer = new He();
```
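In practice, the initializer is passed to a hidden layer when the network is defined. The snippet below assumes the Dense layer constructor of recent Rubix ML releases (neurons, L2 penalty, bias, weight initializer, bias initializer); verify the signature against your installed version.

```php
use Rubix\ML\NeuralNet\Initializers\He;
use Rubix\ML\NeuralNet\Initializers\Constant;
use Rubix\ML\NeuralNet\Layers\Dense;

// Assumes the Dense constructor of recent Rubix ML releases:
// (neurons, l2Penalty, bias, weightInitializer, biasInitializer).
$layer = new Dense(128, 0.0, true, new He(), new Constant(0.0));
```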
- K. He et al. (2015). Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification.