The Xavier 1 initializer draws from a uniform distribution [-limit, limit] where limit is equal to sqrt(6 / (fanIn + fanOut)). This initializer is best suited for layers that feed into an activation function that outputs values between 0 and 1, such as Softmax or Sigmoid.
This initializer does not have any parameters.
```php
use Rubix\ML\NeuralNet\Initializers\Xavier1;

$initializer = new Xavier1();
```
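To see the numbers behind the formula, here is a short illustrative sketch (in Python rather than PHP, and independent of the Rubix ML implementation) of how the bound is computed and a weight is drawn; the fan-in and fan-out values are hypothetical:

```python
import math
import random

def xavier1_limit(fan_in: int, fan_out: int) -> float:
    """Symmetric bound of the Xavier 1 uniform distribution."""
    return math.sqrt(6.0 / (fan_in + fan_out))

# Hypothetical layer with 256 inputs and 128 outputs.
limit = xavier1_limit(256, 128)  # sqrt(6 / 384) = 0.125

# Draw a single weight uniformly from [-limit, limit].
w = random.uniform(-limit, limit)

assert -limit <= w <= limit
```

Note that the bound shrinks as the layer gets wider, which keeps the variance of the activations roughly constant from layer to layer.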
- X. Glorot et al. (2010). Understanding the Difficulty of Training Deep Feedforward Neural Networks.