sphamster / classification-metrics
PHP package to compute confusion matrices and classification metrics
v1.0.0
2025-07-19 00:43 UTC
Requires
- php: ^8.1
Requires (Dev)
- fakerphp/faker: ^1.24
- laravel/pint: ^1.24
- pestphp/pest: ^3.8
- phpstan/phpstan: ^2.1
- phpunit/php-code-coverage: ^11.0
- rector/rector: ^2.1
- xrdebug/php: ^3.0
README
A PHP package for computing confusion matrices and classification metrics for machine learning models.
Installation
You can install the package via Composer:

```bash
composer require sphamster/classification-metrics
```
Requirements
- PHP 8.1 or higher
Usage
Creating a Confusion Matrix
You can create a confusion matrix directly from predictions:
```php
use Sphamster\ClassificationMetrics\ConfusionMatrix;

// Your ground truth labels
$true_labels = ['cat', 'dog', 'cat', 'bird', 'dog', 'bird'];

// Your model's predictions
$predicted_labels = ['cat', 'dog', 'dog', 'bird', 'cat', 'bird'];

// Optional: specify the order of labels (if omitted, unique values from $true_labels are used)
$labels = ['cat', 'dog', 'bird'];

// Create the confusion matrix
$confusion_matrix = ConfusionMatrix::fromPredictions($true_labels, $predicted_labels, $labels);
```
Or you can create one directly from a precomputed matrix:
```php
$labels = ['cat', 'dog', 'bird'];

$matrix = [
    [5, 1, 0], // cat: 5 correct, 1 as dog, 0 as bird
    [2, 8, 1], // dog: 2 as cat, 8 correct, 1 as bird
    [0, 0, 6], // bird: 0 as cat, 0 as dog, 6 correct
];

$confusion_matrix = new ConfusionMatrix($labels, $matrix);
```
Extracting Basic Metrics
The confusion matrix provides methods to extract basic metrics:
```php
// Get true positives for all classes
$tp = $confusion_matrix->truePositives();

// Or for a specific class
$tp_cat = $confusion_matrix->truePositives('cat');

// Similarly for false positives, false negatives, and true negatives
$fp = $confusion_matrix->falsePositives();
$fn = $confusion_matrix->falseNegatives();
$tn = $confusion_matrix->trueNegatives();
```
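For intuition, here is how these counts fall out of a raw confusion matrix where rows are true labels and columns are predictions: a class's true positives sit on the diagonal, its false negatives in the rest of its row, and its false positives in the rest of its column. This standalone sketch (plain PHP, not the package API) derives them by hand from the example matrix above:

```php
<?php
// Standalone sketch: derive per-class counts from a raw confusion matrix.
// Rows are true labels, columns are predicted labels.
$labels = ['cat', 'dog', 'bird'];
$matrix = [
    [5, 1, 0],
    [2, 8, 1],
    [0, 0, 6],
];

$total = array_sum(array_map('array_sum', $matrix));

foreach ($labels as $i => $label) {
    $tp = $matrix[$i][$i];                            // diagonal cell
    $fn = array_sum($matrix[$i]) - $tp;               // rest of the row
    $fp = array_sum(array_column($matrix, $i)) - $tp; // rest of the column
    $tn = $total - $tp - $fn - $fp;                   // everything else
    echo "$label: TP=$tp FP=$fp FN=$fn TN=$tn\n";
}
// cat:  TP=5 FP=2 FN=1 TN=15
// dog:  TP=8 FP=1 FN=3 TN=11
// bird: TP=6 FP=1 FN=0 TN=16
```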
Computing Classification Metrics
The package provides implementations for common classification metrics:
Precision
```php
use Sphamster\ClassificationMetrics\Metrics\Precision;
use Sphamster\ClassificationMetrics\Enums\AverageStrategy;

// Get precision for each class
$precision = new Precision();
$class_precision = $precision->measure($confusion_matrix);

// Get macro-averaged precision
$macro_precision = (new Precision(AverageStrategy::MACRO))->measure($confusion_matrix);

// Get micro-averaged precision
$micro_precision = (new Precision(AverageStrategy::MICRO))->measure($confusion_matrix);

// Get weighted-averaged precision
$weighted_precision = (new Precision(AverageStrategy::WEIGHTED))->measure($confusion_matrix);
```
Recall
```php
use Sphamster\ClassificationMetrics\Metrics\Recall;

// Get recall for each class
$recall = new Recall();
$class_recall = $recall->measure($confusion_matrix);

// Similarly, you can use AverageStrategy for macro, micro, and weighted averaging
```
F1 Score
```php
use Sphamster\ClassificationMetrics\Metrics\F1Score;

// Get F1 score for each class
$f1 = new F1Score();
$class_f1 = $f1->measure($confusion_matrix);

// Similarly, you can use AverageStrategy for macro, micro, and weighted averaging
```
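The F1 score is the harmonic mean of precision and recall. As a quick standalone sanity check (plain PHP, not the package API), using the "cat" counts from the example matrix (TP=5, FP=2, FN=1):

```php
<?php
// Standalone sketch: F1 as the harmonic mean of precision and recall.
$tp = 5; $fp = 2; $fn = 1;

$precision = $tp / ($tp + $fp); // 5/7 ≈ 0.714
$recall    = $tp / ($tp + $fn); // 5/6 ≈ 0.833
$f1        = 2 * $precision * $recall / ($precision + $recall);

printf("precision=%.3f recall=%.3f f1=%.3f\n", $precision, $recall, $f1);
// precision=0.714 recall=0.833 f1=0.769
```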
Averaging Strategies
The package supports three averaging strategies for multi-class metrics:
- Macro: Calculate metrics for each label and find their unweighted mean. This does not take label imbalance into account.
- Micro: Calculate metrics globally by counting the total true positives, false negatives, and false positives.
- Weighted: Calculate metrics for each label and find their average weighted by support (the number of true instances for each label).
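The three strategies can give noticeably different numbers on imbalanced data. This standalone sketch (plain PHP, not the package API) computes all three averages of precision by hand for the example matrix:

```php
<?php
// Standalone sketch: macro, micro, and weighted precision by hand.
$matrix = [
    [5, 1, 0], // cat
    [2, 8, 1], // dog
    [0, 0, 6], // bird
];
$n = count($matrix);

$per_class = [];
$supports  = [];
$tp_total  = $fp_total = 0;

for ($i = 0; $i < $n; $i++) {
    $tp = $matrix[$i][$i];
    $fp = array_sum(array_column($matrix, $i)) - $tp;
    $per_class[$i] = $tp / ($tp + $fp);
    $supports[$i]  = array_sum($matrix[$i]); // number of true instances
    $tp_total += $tp;
    $fp_total += $fp;
}

// Macro: unweighted mean of per-class precision
$macro = array_sum($per_class) / $n;

// Micro: global counts pooled across classes
$micro = $tp_total / ($tp_total + $fp_total);

// Weighted: per-class precision weighted by support
$weighted = 0.0;
foreach ($per_class as $i => $p) {
    $weighted += $p * $supports[$i];
}
$weighted /= array_sum($supports);

printf("macro=%.3f micro=%.3f weighted=%.3f\n", $macro, $micro, $weighted);
// macro=0.820 micro=0.826 weighted=0.835
```

Note that micro-averaged precision weights every prediction equally, so the dominant "dog" class pulls it up, while the weighted average leans on support counts instead.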
Testing
```bash
composer test
```
Code Quality
The package includes tools for maintaining code quality:
```bash
# Run code style fixer
composer lint

# Run static analysis
composer test:types

# Run refactoring tool
composer refactor

# Run all checks
composer test
```
License
The MIT License (MIT). Please see License File for more information.