Proposed by Yann LeCun in a 1998 paper, this initializer was one of the first published attempts to control the variance of activations between layers through weight initialization. It remains a good default choice for many hidden layer configurations.
This initializer does not have any parameters.
```php
use Rubix\ML\NeuralNet\Initializers\LeCun;

$initializer = new LeCun();
```
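To make the variance-scaling idea concrete, below is a minimal sketch of LeCun-style uniform initialization in plain PHP. It assumes weights are drawn uniformly on [-sqrt(3 / fanIn), +sqrt(3 / fanIn)], which gives each weight a variance of 1 / fanIn; this is an illustration of the scheme, not the library's internal implementation.

```php
// Conceptual sketch of LeCun initialization (not the library's internals):
// each weight is drawn from a zero-mean uniform distribution whose
// variance is 1 / fanIn, i.e. limits of ±sqrt(3 / fanIn).
function lecunUniform(int $fanIn, int $fanOut) : array
{
    $limit = sqrt(3.0 / $fanIn);

    $weights = [];

    for ($i = 0; $i < $fanOut; ++$i) {
        for ($j = 0; $j < $fanIn; ++$j) {
            // mt_rand() / mt_getrandmax() is uniform on [0, 1]; rescale to [-limit, limit].
            $weights[$i][$j] = $limit * (2.0 * mt_rand() / mt_getrandmax() - 1.0);
        }
    }

    return $weights;
}
```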
- Y. LeCun et al. (1998). Efficient BackProp.