gowelle / azure-moderator
Azure Content Moderator wrapper for Laravel
Installs: 131
Dependents: 0
Suggesters: 0
Security: 0
Stars: 0
Watchers: 1
Forks: 0
Open Issues: 0
pkg:composer/gowelle/azure-moderator
Requires
- php: ^8.2
- guzzlehttp/guzzle: ^7.9
- illuminate/support: ^10.0 || ^11.0 || ^12.0
Requires (Dev)
- laravel/pint: ^1.0
- mockery/mockery: ^1.6
- orchestra/testbench: ^8.0
- pestphp/pest: ^2.0
- pestphp/pest-plugin-laravel: ^2.0
- spatie/laravel-package-tools: ^1.0
README
A Laravel package for content moderation using Azure Content Safety API. This package helps you analyze both text and image content for potentially harmful material, automatically flagging or approving content based on Azure's AI-powered analysis.
Features
- Easy integration with Azure Content Safety API
- Text and Image content moderation
- Automatic content analysis and flagging
- Configurable severity thresholds
- User rating support (for text moderation)
- Laravel validation rules for images
- Artisan command for testing
- Retry handling for API failures
- Laravel-native configuration
- Extensive logging
Requirements
- PHP 8.2 or higher
- Laravel 10.0 or higher
- Azure Content Safety API subscription
Installation
Install the package via Composer:
composer require gowelle/azure-moderator
Publish the configuration file:
php artisan vendor:publish --provider="Gowelle\AzureModerator\AzureContentSafetyServiceProvider"
Configuration
Add your Azure credentials to your .env file:
AZURE_CONTENT_SAFETY_ENDPOINT=your-endpoint
AZURE_CONTENT_SAFETY_API_KEY=your-api-key
AZURE_CONTENT_SAFETY_LOW_RATING_THRESHOLD=2
AZURE_CONTENT_SAFETY_HIGH_SEVERITY_THRESHOLD=6
AZURE_MODERATOR_FAIL_ON_ERROR=false
Configuration Options
- AZURE_CONTENT_SAFETY_ENDPOINT: Your Azure Content Safety API endpoint URL
- AZURE_CONTENT_SAFETY_API_KEY: Your Azure API key (keep this secure!)
- AZURE_CONTENT_SAFETY_LOW_RATING_THRESHOLD: Minimum rating required to approve text content (0-5, default: 2)
- AZURE_CONTENT_SAFETY_HIGH_SEVERITY_THRESHOLD: Minimum severity at which content is flagged (0-7, default: 3)
- AZURE_MODERATOR_FAIL_ON_ERROR: Whether validation should fail when the API is unavailable (default: false)
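The published configuration file reads these values from the environment. The file name and key names below are illustrative assumptions, not necessarily the package's actual config (only fail_on_api_error is referenced elsewhere in this README); a minimal sketch might look like this:

<?php

// config/azure-moderator.php (file name and most keys are illustrative assumptions)
return [
    // Azure Content Safety endpoint and credentials
    'endpoint' => env('AZURE_CONTENT_SAFETY_ENDPOINT'),
    'api_key' => env('AZURE_CONTENT_SAFETY_API_KEY'),

    // Moderation thresholds
    'low_rating_threshold' => env('AZURE_CONTENT_SAFETY_LOW_RATING_THRESHOLD', 2),
    'high_severity_threshold' => env('AZURE_CONTENT_SAFETY_HIGH_SEVERITY_THRESHOLD', 3),

    // Fail validation when the API is unavailable (strict mode)
    'fail_on_api_error' => env('AZURE_MODERATOR_FAIL_ON_ERROR', false),
];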
Usage
Text Moderation
Basic Usage
use Gowelle\AzureModerator\Facades\AzureModerator;

// Moderate content
$result = AzureModerator::moderate('Some text content', 4.5);

// Check result
if ($result['status'] === 'approved') {
    // Content is safe
} else {
    // Content was flagged
    $reason = $result['reason'];
}
Custom Categories
use Gowelle\AzureModerator\Enums\ContentCategory;

$result = AzureModerator::moderate(
    text: 'Some text content',
    rating: 4.5,
    categories: [
        ContentCategory::HATE->value,
        ContentCategory::VIOLENCE->value,
    ]
);
Image Moderation
Basic Image Moderation
use Gowelle\AzureModerator\Facades\AzureModerator;

// Moderate image by URL
$result = AzureModerator::moderateImage('https://example.com/image.jpg');

// Check result
if ($result['status'] === 'approved') {
    // Image is safe
} else {
    // Image was flagged
    $reason = $result['reason'];
    $scores = $result['scores']; // Detailed severity scores
}
Base64 Image Moderation
// Moderate uploaded image
$imageData = file_get_contents($uploadedFile->getRealPath());
$base64Image = base64_encode($imageData);

$result = AzureModerator::moderateImage(
    image: $base64Image,
    encoding: 'base64'
);
Note: Base64 images are limited to 4MB of encoded data, which corresponds to roughly 3MB of original image data (base64 encoding adds ~33% overhead).
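If you want to guard against that limit before calling the API, you can check the size of the encoded payload yourself. This is a minimal sketch built on the documented moderateImage() call; the size check and the exception are your own application code, not part of the package:

// Encode the uploaded file and verify it fits within the documented 4MB limit
$imageData = file_get_contents($uploadedFile->getRealPath());
$base64Image = base64_encode($imageData);

if (strlen($base64Image) > 4 * 1024 * 1024) {
    // Too large for base64 moderation; resize the image or moderate it by URL instead
    throw new \RuntimeException('Encoded image exceeds the 4MB base64 limit.');
}

$result = AzureModerator::moderateImage(
    image: $base64Image,
    encoding: 'base64'
);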
Image Moderation with Custom Categories
use Gowelle\AzureModerator\Enums\ContentCategory;

$result = AzureModerator::moderateImage(
    image: 'https://example.com/image.jpg',
    categories: [
        ContentCategory::SEXUAL->value,
        ContentCategory::VIOLENCE->value,
    ]
);
Laravel Validation
Use the SafeImage validation rule to automatically validate uploaded images:
use Gowelle\AzureModerator\Enums\ContentCategory;
use Gowelle\AzureModerator\Rules\SafeImage;

// In your form request or controller
$request->validate([
    'avatar' => ['required', 'image', 'max:2048', new SafeImage()],
]);

// With custom categories
$request->validate([
    'profile_picture' => [
        'required',
        'image',
        new SafeImage([
            ContentCategory::SEXUAL->value,
            ContentCategory::VIOLENCE->value,
        ]),
    ],
]);
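The rule also fits into a standard Laravel form request. The class below is an illustrative sketch (the class name and field name are placeholders) that uses only the documented SafeImage rule and ContentCategory enum:

<?php

namespace App\Http\Requests;

use Gowelle\AzureModerator\Enums\ContentCategory;
use Gowelle\AzureModerator\Rules\SafeImage;
use Illuminate\Foundation\Http\FormRequest;

// Hypothetical form request demonstrating the SafeImage rule
class UpdateAvatarRequest extends FormRequest
{
    public function authorize(): bool
    {
        return true;
    }

    public function rules(): array
    {
        return [
            'avatar' => [
                'required',
                'image',
                'max:2048',
                new SafeImage([
                    ContentCategory::SEXUAL->value,
                    ContentCategory::VIOLENCE->value,
                ]),
            ],
        ];
    }
}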
Error Handling
The package provides flexible error handling to balance security and user experience:
use Gowelle\AzureModerator\Exceptions\ModerationException;
use Gowelle\AzureModerator\Facades\AzureModerator;
use Illuminate\Support\Facades\Log;

try {
    $result = AzureModerator::moderate('Some text content', 4.5);
} catch (ModerationException $e) {
    // Handle API errors (only thrown for input validation errors in moderate())
    Log::error('Moderation failed', [
        'message' => $e->getMessage(),
        'endpoint' => $e->endpoint,
        'status' => $e->statusCode,
    ]);
}
Graceful Degradation and Strict Mode
The fail_on_api_error configuration controls how the package behaves when the Azure API is unavailable:
Default Behavior (fail_on_api_error = false):
- When the Azure API fails or is unavailable, both moderate() and moderateImage() return an approved status
- The SafeImage validation rule passes validation, allowing content through
- This prevents blocking users during API outages
- Best for: Production environments prioritizing user experience
Strict Mode (fail_on_api_error = true):
- When the Azure API fails, the SafeImage validation rule fails with the message: "Unable to validate :attribute safety. Please try again." (handled like any other validation error; see the controller sketch after this list)
- Content cannot be moderated until the API is available
- Best for: High-security environments requiring strict content moderation enforcement
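In strict mode, an unreachable API therefore surfaces through normal Laravel validation and can be handled like any other validation failure. A minimal controller sketch (the method, field, and storage path are hypothetical; only SafeImage comes from the package):

use Gowelle\AzureModerator\Rules\SafeImage;
use Illuminate\Http\Request;

public function store(Request $request)
{
    // With AZURE_MODERATOR_FAIL_ON_ERROR=true, an API outage makes SafeImage fail,
    // and its message appears in the validation errors like any other rule.
    $validated = $request->validate([
        'avatar' => ['required', 'image', new SafeImage()],
    ]);

    // Reaching this point means the image was moderated and approved
    $request->file('avatar')->store('avatars');

    return back()->with('status', 'Avatar updated.');
}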
Configuration:
# Default: false (graceful degradation)
AZURE_MODERATOR_FAIL_ON_ERROR=false

# Strict mode: true (fail validation on API errors)
AZURE_MODERATOR_FAIL_ON_ERROR=true
Retry Logic: The package includes automatic retry logic with exponential backoff for:
- Rate limit errors (429)
- Server errors (500, 503)
- Up to 3 retry attempts per request (an illustrative sketch of this pattern follows)
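The package's internal retry code is not shown in this README. As a sketch of the general pattern it describes, the snippet below wires Guzzle's retry middleware to retry 429/500/503 responses up to 3 times with exponential backoff; this is an assumption about how such retries can be implemented, not the package's actual code:

use GuzzleHttp\Client;
use GuzzleHttp\HandlerStack;
use GuzzleHttp\Middleware;
use Psr\Http\Message\ResponseInterface;

$stack = HandlerStack::create();

// Retry up to 3 times on rate limiting (429) and server errors (500, 503)
$stack->push(Middleware::retry(
    function (int $retries, $request, ?ResponseInterface $response = null): bool {
        return $retries < 3
            && $response !== null
            && in_array($response->getStatusCode(), [429, 500, 503], true);
    },
    // Exponential backoff: 1s, 2s, 4s (Guzzle expects the delay in milliseconds)
    fn (int $retries): int => (int) (1000 * 2 ** $retries)
));

$client = new Client(['handler' => $stack]);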
Artisan Commands
Test image moderation from the command line:
# Test image moderation
php artisan azure-moderator:test-image https://example.com/image.jpg

# Test with specific categories
php artisan azure-moderator:test-image https://example.com/image.jpg --categories=Sexual,Violence
Testing
composer test
Changelog
Please see CHANGELOG for more information on what has changed recently.
Contributing
Please see CONTRIBUTING for details.
Security
If you discover any security-related issues, please email security@gowelle.com instead of using the issue tracker.
Please review our Security Policy for more details.
Credits
License
The MIT License (MIT). Please see License File for more information.