
# Softmax Classifier

A generalization of Logistic Regression to multiclass classification problems, implemented as a single-layer neural network with a Softmax output layer.

**Interfaces:** Estimator, Learner, Online, Probabilistic, Verbose, Persistable

**Data Type Compatibility:** Continuous
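The Softmax output layer squashes a vector of raw activations into a probability distribution over the classes. A minimal sketch of the function itself, for intuition only (this is not the library's internal implementation):

```php
<?php

// Illustrative softmax: maps arbitrary real scores to class probabilities.
function softmax(array $z) : array
{
    $max = max($z); // subtract the max score for numerical stability

    $exp = array_map(fn ($v) => exp($v - $max), $z);

    $total = array_sum($exp);

    return array_map(fn ($v) => $v / $total, $exp);
}

// The outputs are non-negative and sum to 1, so each can be read as
// the estimated probability of the corresponding class.
```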

## Parameters

| # | Param | Default | Type | Description |
|---|-------|---------|------|-------------|
| 1 | batch size | 100 | int | The number of training samples to process at a time. |
| 2 | optimizer | Adam | object | The gradient descent optimizer used to train the underlying network. |
| 3 | alpha | 1e-4 | float | The amount of L2 regularization to apply to the weights of the network. |
| 4 | epochs | 1000 | int | The maximum number of training epochs to execute. |
| 5 | min change | 1e-4 | float | The minimum change in the cost function necessary to continue training. |
| 6 | cost fn | Cross Entropy | object | The function that computes the cost of an erroneous activation during training. |
| 7 | window | 5 | int | The number of epochs without improvement in the training loss to wait before considering an early stop. |
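All parameters have defaults, so the simplest instantiation takes no arguments and uses the values from the table above. A sketch:

```php
<?php

use Rubix\ML\Classifiers\SoftmaxClassifier;

// With no arguments, the defaults from the table above apply:
// batch size 100, Adam optimizer, alpha 1e-4, 1000 epochs,
// min change 1e-4, Cross Entropy cost function, and a window of 5.
$estimator = new SoftmaxClassifier();
```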

## Additional Methods

Return the average loss of a sample at each epoch of training:

```php
public steps() : array
```

Return the underlying neural network instance or `null` if untrained:

```php
public network() : Network|null
```
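Since the learner is Verbose about its progress, the per-epoch losses returned by `steps()` can be inspected after training to check convergence. A hedged sketch, assuming `$estimator` has already been trained:

```php
<?php

// $estimator is a trained SoftmaxClassifier (hypothetical here).
// steps() returns one average loss value per completed epoch, so a
// steadily decreasing sequence indicates the optimizer is converging.
$losses = $estimator->steps();

foreach ($losses as $epoch => $loss) {
    echo 'Epoch ' . ($epoch + 1) . ": loss $loss" . PHP_EOL;
}
```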

## Example

```php
use Rubix\ML\Classifiers\SoftmaxClassifier;
use Rubix\ML\NeuralNet\Optimizers\Momentum;
use Rubix\ML\NeuralNet\CostFunctions\CrossEntropy;

$estimator = new SoftmaxClassifier(256, new Momentum(0.001), 1e-4, 300, 1e-4, new CrossEntropy(), 10);
```
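To round out the example, here is a hedged sketch of training and inference, assuming a small hypothetical dataset of continuous features with string class labels (the `Labeled` and `Unlabeled` dataset objects are standard in the library; the sample values are made up for illustration):

```php
<?php

use Rubix\ML\Classifiers\SoftmaxClassifier;
use Rubix\ML\Datasets\Labeled;
use Rubix\ML\Datasets\Unlabeled;

// Hypothetical continuous samples and their class labels.
$training = new Labeled([
    [2.3, 1.1, 0.4],
    [0.9, 3.2, 1.8],
    [1.5, 0.7, 2.6],
], ['cat', 'dog', 'frog']);

$estimator = new SoftmaxClassifier();

$estimator->train($training);

$unknown = new Unlabeled([
    [2.1, 1.0, 0.5],
]);

// predict() returns a class label per sample; proba() returns a
// probability distribution over the classes (Probabilistic interface).
$predictions = $estimator->predict($unknown);
$probabilities = $estimator->proba($unknown);
```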