# Xavier 1
The Xavier 1 initializer draws from a uniform distribution over the interval `[-limit, limit]`, where `limit = sqrt(6 / (fanIn + fanOut))`. This initializer is best suited for layers that feed into an activation layer that outputs values between 0 and 1, such as Softmax or Sigmoid.
## Parameters
This initializer does not have any parameters.
## Example
```php
use Rubix\ML\NeuralNet\Initializers\Xavier1;

$initializer = new Xavier1();
```
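For illustration only, the following minimal sketch shows the computation the initializer performs in plain PHP: computing the limit from the fan in and fan out of a layer and drawing a weight matrix uniformly from `[-limit, limit]`. It is not part of the Rubix ML API, and the variable names and layer dimensions are assumptions made for this example.

```php
<?php

// Assumed example dimensions: 64 inputs feeding a layer of 10 neurons.
$fanIn = 64;
$fanOut = 10;

// Xavier 1 limit: sqrt(6 / (fanIn + fanOut)).
$limit = sqrt(6.0 / ($fanIn + $fanOut));

// Fill a fanOut x fanIn weight matrix with uniform draws from [-limit, limit].
$weights = [];

for ($i = 0; $i < $fanOut; ++$i) {
    $row = [];

    for ($j = 0; $j < $fanIn; ++$j) {
        $row[] = $limit * (2.0 * mt_rand() / mt_getrandmax() - 1.0);
    }

    $weights[] = $row;
}
```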
## References
1. X. Glorot et al. (2010). Understanding the Difficulty of Training Deep Feedforward Neural Networks.