# Activation
Activation layers apply a user-defined non-linear activation function to their inputs. They are often used in conjunction with [Dense](dense.md) layers to transform their output.
## Parameters
| # | Name | Default | Type | Description |
|---|---|---|---|---|
| 1 | activationFn | | ActivationFunction | The function that computes the output of the layer. |
## Example
```php
use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;

$layer = new Activation(new ReLU());
```
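In practice, Activation layers are interleaved with the hidden Dense layers of a network. The sketch below assumes the `MultilayerPerceptron` classifier and `Dense` layer documented elsewhere in the library; layer widths are illustrative.

```php
use Rubix\ML\Classifiers\MultilayerPerceptron;
use Rubix\ML\NeuralNet\Layers\Dense;
use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;

// Each Dense layer computes a weighted sum of its inputs, and the
// Activation layer that follows applies ReLU to introduce non-linearity.
$estimator = new MultilayerPerceptron([
    new Dense(100),
    new Activation(new ReLU()),
    new Dense(50),
    new Activation(new ReLU()),
]);
```

Without the Activation layers, a stack of Dense layers would collapse into a single linear transformation, so the non-linearity is what gives the network its expressive power.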
Last update: 2021-01-25