Metrics#

Validation metrics evaluate the performance of an Estimator by computing a score from a set of predictions and their corresponding ground-truth labels.

Scoring Predictions#

To compute a validation score, pass the predictions from an estimator along with their ground-truth labels:

public score(array $predictions, array $labels) : float

Example

use Rubix\ML\CrossValidation\Metrics\MeanAbsoluteError;

// Train an estimator and make predictions

$metric = new MeanAbsoluteError();

$score = $metric->score($predictions, $labels);

var_dump($score);

Output

float(-0.99846070553066)

Note: Regression metrics output the negative of their error to maintain the convention that cross validation scores should be maximized, unlike loss functions, which are minimized.
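As a quick illustration using hypothetical values, a set of predictions with a smaller average absolute error receives a greater (less negative) score than a set with a larger error:

use Rubix\ML\CrossValidation\Metrics\MeanAbsoluteError;

$metric = new MeanAbsoluteError();

$labels = [10.0, 20.0, 30.0];

// Predictions off by 0.5 on average score -0.5 ...
$better = $metric->score([10.5, 19.5, 30.5], $labels);

// ... while predictions off by 2.33 on average score lower.
$worse = $metric->score([12.0, 18.0, 33.0], $labels);

var_dump($better > $worse);

Output

bool(true)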

Output Range#

Return the range of values that the validation score can take on, given as a 2-tuple containing the minimum and maximum score:

public range() : array

Example

[$min, $max] = $metric->range();

var_dump($min);
var_dump($max);

Output

float(-INF)
int(0)
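Since the bounds differ from metric to metric, the range can serve as a quick sanity check that a computed score is valid. A minimal sketch, assuming $score was computed as in the earlier example:

[$min, $max] = $metric->range();

// A valid score always falls within the metric's output range.
assert($score >= $min and $score <= $max);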

Compatibility#

Return a list of integer-encoded estimator types that the metric is compatible with:

public compatibility() : array

Example

var_dump($metric->compatibility());

Output

array(1) {
  [0]=> int(2)
}
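In this encoding, the int(2) above corresponds to the regressor type, consistent with MeanAbsoluteError being a regression metric. A minimal sketch, assuming the Estimator interface exposes an integer-encoded type() method as in versions of the library that use integer-encoded types, the list can be used to guard against scoring with an incompatible metric:

use Rubix\ML\Regressors\Ridge;

$estimator = new Ridge();

// Refuse to score if the estimator's type is not in the metric's compatibility list.
if (!in_array($estimator->type(), $metric->compatibility())) {
    throw new InvalidArgumentException('Metric is not compatible with this estimator.');
}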