gowelle / azure-moderator
Azure Content Moderator wrapper for Laravel
Installs: 173
Dependents: 0
Suggesters: 0
Security: 0
Stars: 0
Watchers: 1
Forks: 0
Open Issues: 0
pkg:composer/gowelle/azure-moderator
Requires
- php: ^8.2
- guzzlehttp/guzzle: ^7.9
- illuminate/support: ^10.0 || ^11.0 || ^12.0
Requires (Dev)
- infection/infection: ^0.27
- laravel/pint: ^1.0
- mockery/mockery: ^1.6
- orchestra/testbench: ^8.0
- pestphp/pest: ^2.0
- pestphp/pest-plugin-laravel: ^2.0
- phpstan/phpstan: ^1.10
- spatie/laravel-package-tools: ^1.0
README
A Laravel package for content moderation using Azure Content Safety API. This package helps you analyze both text and image content for potentially harmful material, automatically flagging or approving content based on Azure's AI-powered analysis.
Table of Contents
- Features
- Requirements
- Installation
- Configuration
- Usage
- Testing
- Documentation
- Changelog
- Contributing
- Security
- Credits
- License
Features
- Easy integration with Azure Content Safety API
- Text and Image content moderation
- Multimodal Analysis (Preview) - Combined text + image analysis
- Multi-Modal Analysis (Batch & Async)
- Custom Blocklist Management
- Protected Material Detection
- Strongly-typed DTO responses (ModerationResult, CategoryAnalysis, MultimodalResult)
- Automatic content analysis and flagging
- Configurable severity thresholds
- User rating support (for text moderation)
- Laravel validation rules for text and images
- Artisan commands for testing & management
- Retry handling for API failures
- Comprehensive test suite (90+ tests)
- Integration tests with real Azure API
- PHPStan level 6 static analysis
- Performance benchmarks
- Laravel-native configuration
- Extensive logging
Requirements
- PHP 8.2 or higher
- Laravel 10.0 or higher
- Azure Content Safety API subscription
Installation
Install the package via composer:
composer require gowelle/azure-moderator
Publish the configuration file:
php artisan vendor:publish --provider="Gowelle\AzureModerator\AzureContentSafetyServiceProvider"
Configuration
Add your Azure credentials to your .env file:
AZURE_CONTENT_SAFETY_ENDPOINT=your-endpoint
AZURE_CONTENT_SAFETY_API_KEY=your-api-key
AZURE_CONTENT_SAFETY_LOW_RATING_THRESHOLD=2
AZURE_CONTENT_SAFETY_HIGH_SEVERITY_THRESHOLD=6
AZURE_MODERATOR_FAIL_ON_ERROR=false
Configuration Options
- AZURE_CONTENT_SAFETY_ENDPOINT: Your Azure Content Safety API endpoint URL
- AZURE_CONTENT_SAFETY_API_KEY: Your Azure API key (keep this secure!)
- AZURE_CONTENT_SAFETY_LOW_RATING_THRESHOLD: Minimum rating to approve text content (0-5, default: 2)
- AZURE_CONTENT_SAFETY_HIGH_SEVERITY_THRESHOLD: Minimum severity to flag content (0-7, default: 3)
- AZURE_MODERATOR_FAIL_ON_ERROR: Whether validation should fail when the API is unavailable (default: false)
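These environment variables are read through the published config file. As a rough sketch of how that mapping typically looks (the key names below are assumptions for illustration only; consult the published config file for the authoritative structure):

<?php

// Illustrative sketch only: the key names are assumed, not taken from the package source.
return [
    'endpoint' => env('AZURE_CONTENT_SAFETY_ENDPOINT'),
    'api_key' => env('AZURE_CONTENT_SAFETY_API_KEY'),
    'low_rating_threshold' => env('AZURE_CONTENT_SAFETY_LOW_RATING_THRESHOLD', 2),
    'high_severity_threshold' => env('AZURE_CONTENT_SAFETY_HIGH_SEVERITY_THRESHOLD', 3),
    'fail_on_api_error' => env('AZURE_MODERATOR_FAIL_ON_ERROR', false),
];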
Usage
Text Moderation
Basic Usage
use Gowelle\AzureModerator\Facades\AzureModerator;

// Moderate content - returns ModerationResult DTO
$result = AzureModerator::moderate('Some text content', 4.5);

// Check result using DTO methods
if ($result->isApproved()) {
    // Content is safe
} else {
    // Content was flagged
    $reason = $result->reason;
}
Custom Categories
use Gowelle\AzureModerator\Enums\ContentCategory;

$result = AzureModerator::moderate(
    text: 'Some text content',
    rating: 4.5,
    categories: [
        ContentCategory::HATE->value,
        ContentCategory::VIOLENCE->value
    ]
);
Image Moderation
Basic Image Moderation
use Gowelle\AzureModerator\Facades\AzureModerator;

// Moderate image by URL - returns ModerationResult DTO
$result = AzureModerator::moderateImage('https://example.com/image.jpg');

// Check result using DTO methods
if ($result->isApproved()) {
    // Image is safe
} else {
    // Image was flagged
    $reason = $result->reason;
    $scores = $result->categoriesAnalysis; // Array of CategoryAnalysis DTOs
}
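If you need per-category detail, the CategoryAnalysis DTOs can be inspected individually. The property names used below (category, severity) are assumptions about the DTO shape rather than something this README confirms:

// Sketch: inspect per-category severity scores (property names are assumed)
foreach ($result->categoriesAnalysis as $analysis) {
    echo $analysis->category . ': ' . $analysis->severity . PHP_EOL; // e.g. "Violence: 4"
}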
Base64 Image Moderation
// Moderate uploaded image
$imageData = file_get_contents($uploadedFile->getRealPath());
$base64Image = base64_encode($imageData);

$result = AzureModerator::moderateImage(
    image: $base64Image,
    encoding: 'base64'
);
Note: Base64 images are limited to 4MB of encoded data, which corresponds to approximately 3MB of original image size (due to base64 encoding overhead of ~33%).
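Given that limit, it can be worth rejecting oversized payloads before calling the API. A minimal sketch:

// Reject images whose base64 payload would exceed the 4MB limit before calling Azure
$imageData = file_get_contents($uploadedFile->getRealPath());
$base64Image = base64_encode($imageData);

if (strlen($base64Image) > 4 * 1024 * 1024) {
    throw new \InvalidArgumentException('Encoded image exceeds the 4MB base64 limit.');
}

$result = AzureModerator::moderateImage(image: $base64Image, encoding: 'base64');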
Image Moderation with Custom Categories
use Gowelle\AzureModerator\Enums\ContentCategory;

$result = AzureModerator::moderateImage(
    image: 'https://example.com/image.jpg',
    categories: [
        ContentCategory::SEXUAL->value,
        ContentCategory::VIOLENCE->value
    ]
);
Laravel Validation
Use the SafeImage validation rule to automatically validate uploaded images:
use Gowelle\AzureModerator\Enums\ContentCategory;
use Gowelle\AzureModerator\Rules\SafeImage;

// In your form request or controller
$request->validate([
    'avatar' => ['required', 'image', 'max:2048', new SafeImage()],
]);

// With custom categories
$request->validate([
    'profile_picture' => [
        'required',
        'image',
        new SafeImage([
            ContentCategory::SEXUAL->value,
            ContentCategory::VIOLENCE->value
        ])
    ],
]);
Error Handling
The package provides flexible error handling to ensure both security and user experience:
use Gowelle\AzureModerator\Exceptions\ModerationException;
use Illuminate\Support\Facades\Log;

try {
    $result = AzureModerator::moderate('Some text content', 4.5);
} catch (ModerationException $e) {
    // Handle API errors (only thrown for input validation errors in moderate())
    Log::error('Moderation failed', [
        'message' => $e->getMessage(),
        'endpoint' => $e->endpoint,
        'status' => $e->statusCode
    ]);
}
Graceful Degradation and Strict Mode
The fail_on_api_error configuration controls how the package behaves when the Azure API is unavailable:
Default Behavior (fail_on_api_error = false):
- When the Azure API fails or is unavailable, both moderate() and moderateImage() return an approved status
- The SafeImage validation rule passes validation, allowing content through
- This prevents blocking users during API outages
- Best for: Production environments prioritizing user experience
Strict Mode (fail_on_api_error = true):
- When the Azure API fails, the SafeImage validation rule fails with the message: "Unable to validate :attribute safety. Please try again."
- Content cannot be moderated until the API is available
- Best for: High-security environments requiring strict content moderation enforcement
Configuration:
# Default: false (graceful degradation)
AZURE_MODERATOR_FAIL_ON_ERROR=false

# Strict mode: true (fail validation on API errors)
AZURE_MODERATOR_FAIL_ON_ERROR=true
Retry Logic: The package includes automatic retry logic with exponential backoff (the general pattern is sketched after this list) for:
- Rate limit errors (429)
- Server errors (500, 503)
- Up to 3 retry attempts per request
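These retries happen inside the package, so you do not need to implement them yourself. The snippet below is only an illustrative sketch of the exponential backoff pattern, not the package's actual implementation; the helper name and the way the HTTP status is read are assumptions:

// Illustrative backoff sketch only; the package already retries internally.
// retryWithBackoff() and reading the HTTP status from the exception code are assumptions.
function retryWithBackoff(callable $callAzure, int $maxAttempts = 3)
{
    $retryableStatuses = [429, 500, 503];

    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        try {
            return $callAzure();
        } catch (\RuntimeException $e) {
            $status = (int) $e->getCode(); // assume the HTTP status is carried on the exception code

            if ($attempt === $maxAttempts || ! in_array($status, $retryableStatuses, true)) {
                throw $e;
            }

            // Back off 1s, 2s, 4s, ... between attempts
            usleep((int) (2 ** ($attempt - 1) * 1_000_000));
        }
    }
}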
Multimodal Analysis (Preview)
⚠️ Preview API: Uses API version 2024-09-15-preview. Feature availability varies by Azure region.
Analyze images with associated text for contextual content moderation:
use Gowelle\AzureModerator\MultimodalService;

$service = app(MultimodalService::class);

// Analyze image with caption
$result = $service->analyze(
    image: $base64ImageData,
    text: 'User-provided caption',
    encoding: 'base64',
    enableOcr: true
);

if ($result->isFlagged()) {
    echo $result->reason; // "High severity in: Violence"
}
SafeMultimodal Validation Rule
use Gowelle\AzureModerator\Rules\SafeMultimodal;

$request->validate([
    'image' => [
        'required',
        'image',
        new SafeMultimodal(
            text: $request->caption,
            categories: ['Sexual', 'Violence']
        )
    ],
]);
CLI Testing
# Test with URL
php artisan azure-moderator:test-multimodal https://example.com/image.jpg --text="Caption"

# Test with local file
php artisan azure-moderator:test-multimodal ./path/to/image.jpg --local --text="Caption"
Multi-Modal Analysis (Batch & Async)
Process multiple items or perform context-aware analysis:
// Batch Moderation
$results = AzureModerator::moderateBatch([
    ['type' => 'text', 'content' => 'Comment 1', 'rating' => 4.5],
    ['type' => 'image', 'content' => 'https://example.com/img.jpg'],
]);

// Context-Aware (Text + Image)
$result = AzureModerator::moderateWithContext(
    text: 'Check this out!',
    imageUrl: 'https://example.com/meme.jpg',
    rating: 4.0
);
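Assuming moderateBatch() returns one ModerationResult per input item (an assumption; this README does not spell out the return shape), handling the results might look like this:

use Illuminate\Support\Facades\Log;

// Sketch: assumes $results is an array of ModerationResult DTOs, one per input item
foreach ($results as $index => $result) {
    if (! $result->isApproved()) {
        // Flag the corresponding item for manual review
        Log::warning('Batch item flagged', ['index' => $index, 'reason' => $result->reason]);
    }
}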
For background processing, dispatch the job:
use Gowelle\AzureModerator\Jobs\ModerateContentJob;

ModerateContentJob::dispatch(
    contentType: 'text',
    content: 'User bio update',
    rating: 4.5,
    metadata: ['user_id' => 123]
);
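Because this is a standard queued job, the usual Laravel dispatch options apply; for example, routing it to a dedicated queue (the queue name here is just an example):

// Route moderation work to a dedicated queue (queue name is an example)
ModerateContentJob::dispatch(
    contentType: 'text',
    content: 'User bio update',
    rating: 4.5,
    metadata: ['user_id' => 123]
)->onQueue('moderation');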
Custom Blocklists
Manage custom blocklists to filter specific terms.
# Create and manage lists via CLI
php artisan azure-moderator:blocklist create my-list "Banned words"
php artisan azure-moderator:blocklist add-item my-list "forbidden_term"
Use in moderation:
$result = AzureModerator::moderate(
    text: 'Some text',
    rating: 4.5,
    blocklistNames: ['my-list']
);
See Blocklists Guide for full details.
Protected Material Detection
Detect copyrighted content in text:
php artisan azure-moderator:test-protected "Lyrics to a song..."
Or use the validation rule:
use Gowelle\AzureModerator\Rules\SafeText;

$request->validate([
    'content' => ['required', new SafeText()],
]);
See Protected Material Guide for details.
Artisan Commands
Test image moderation from the command line:
# Test image moderation
php artisan azure-moderator:test-image https://example.com/image.jpg

# Test with specific categories
php artisan azure-moderator:test-image https://example.com/image.jpg --categories=Sexual,Violence
Testing
This package includes a comprehensive test suite with unit tests, integration tests, and performance benchmarks.
Running Tests
# Run unit tests
composer test

# Run integration tests (requires Azure credentials)
composer test:integration

# Run performance benchmarks
composer test:performance

# Run all tests
composer test:all

# Generate coverage report
composer test-coverage
Integration Tests
Integration tests validate the package against the real Azure Content Safety API. To run them:
1. Copy the example environment file:
   cp .env.integration.example .env.integration
2. Add your Azure credentials to .env.integration:
   AZURE_CONTENT_SAFETY_ENDPOINT=https://your-resource.cognitiveservices.azure.com
   AZURE_CONTENT_SAFETY_API_KEY=your-api-key
3. Run integration tests:
   composer test:integration
Test Coverage:
- 30+ unit tests
- 50+ integration tests (Azure API)
- 10+ performance benchmarks
- Total: 90+ tests with 100% pass rate
See Integration Testing Guide for detailed documentation.
Quality Tools
# Run PHPStan static analysis (level 6)
composer analyse

# Run mutation testing
composer mutate

# Check code style
composer format

# Run all quality checks
composer quality
CI/CD
GitHub Actions automatically runs:
- Unit tests (PHP 8.2 & 8.3)
- Integration tests (when secrets are configured)
- PHPStan static analysis
- Code style checks
To enable integration tests in CI, add these secrets to your repository:
- AZURE_CONTENT_SAFETY_ENDPOINT
- AZURE_CONTENT_SAFETY_API_KEY
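One way to add them is with the GitHub CLI (gh secret set prompts for the value):

gh secret set AZURE_CONTENT_SAFETY_ENDPOINT
gh secret set AZURE_CONTENT_SAFETY_API_KEY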
Documentation
- Blocklists Guide
- Protected Material Guide
- Integration Testing Guide
- Performance Testing Guide
- Troubleshooting Guide
- API Response Examples
- Roadmap
Changelog
Please see CHANGELOG for more information on what has changed recently.
Contributing
Please see CONTRIBUTING for details.
Security
If you discover any security-related issues, please email gowelle.john@icloud.com instead of using the issue tracker.
Please review our Security Policy for more details.
Credits
License
The MIT License (MIT). Please see License File for more information.