keggermont/laravel-amphp

Boost Laravel performance with AmPHP's asynchronous capabilities. Run parallel tasks, HTTP requests, and database operations with automatic chunking while retaining a simple, expressive API. Perfect for improving API response times and resource utilization.


A Laravel package that integrates the AmPHP asynchronous PHP framework to enhance concurrency and performance in your Laravel applications.

What Are AmPHP and Fibers?

AmPHP is a non-blocking concurrency framework for PHP that lets you write asynchronous code in a readable, synchronous style. PHP 8.1 introduced native Fibers: lightweight, cooperatively scheduled coroutines that enable multitasking within a single PHP process.

Key concepts:

  • Fibers allow functions to pause execution and later resume from the same point without blocking the entire process (see the short sketch after this list)
  • Non-blocking I/O means your application can handle multiple operations concurrently (especially I/O bound operations like HTTP requests, database queries, and file operations)
  • Cooperative multitasking lets your code decide when to yield control, making concurrent programming more predictable
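
To make the pause/resume idea concrete, here is what a native PHP 8.1 Fiber looks like on its own (plain PHP, independent of this package):

$fiber = new Fiber(function (): void {
    // Pause here; control returns to the caller along with the value 'paused'
    $resumedWith = Fiber::suspend('paused');
    echo "Resumed with: {$resumedWith}\n";
});

$paused = $fiber->start();  // runs until Fiber::suspend(); $paused === 'paused'
$fiber->resume('hello');    // continues after suspend() and prints "Resumed with: hello"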

This package leverages AmPHP and Fibers to bring these powerful concurrency features to Laravel in a simple, easy-to-use way.

Why Use Laravel AmPHP?

This package provides significant benefits for Laravel applications:

  • Improved performance for I/O bound operations by running tasks concurrently
  • Reduced response times for API endpoints that need to call multiple external services
  • Better resource utilization by keeping your CPU busy while waiting for I/O operations
  • Simple API that integrates seamlessly with Laravel's existing patterns
  • Zero configuration required to get started

Perfect for applications that:

  • Make multiple API calls
  • Process data in batches
  • Need to reduce response times for complex operations
  • Want to improve server resource utilization

Automatic Task Chunking

One of the key features of this package is automatic task chunking. Here's why it matters:

Why Chunking?

When dealing with a large number of parallel tasks:

  1. Resource Management: Running too many concurrent tasks can exhaust system resources (memory, file descriptors, sockets)
  2. Diminishing Returns: Beyond a certain point, adding more concurrent tasks doesn't improve performance and can even degrade it
  3. Server Limits: External services often have rate limits or connection limits

How Chunking Works

The package implements an intelligent chunking system:

  • Tasks are automatically divided into smaller batches ("chunks")
  • Each chunk is processed concurrently
  • Once a chunk completes, the next chunk begins processing
  • Results are combined seamlessly as if all tasks ran at once

For example, if you have 1000 items to process and a chunk size of 100:

  • Instead of 1000 concurrent tasks (which could overwhelm the system)
  • The system runs 100 tasks at a time, for 10 batches
  • You still get maximum performance without resource exhaustion
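
Conceptually, the behaviour can be sketched in plain PHP. This is an illustration only, not the package's internals; processConcurrently() below is a hypothetical stand-in for "run these items at the same time and return their results":

$items     = range(1, 1000);
$chunkSize = 100;
$results   = [];

foreach (array_chunk($items, $chunkSize) as $chunk) {
    // Each chunk of 100 items runs concurrently...
    $chunkResults = processConcurrently($chunk);
    // ...and the next chunk only starts once this one has finished.
    $results = array_merge($results, $chunkResults);
}
// $results now holds all 1000 results without ever having more
// than 100 tasks in flight at once.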

Configurable Chunking

The chunking system is:

  • Enabled by default with sensible defaults
  • Fully configurable with custom chunk sizes per method
  • Adaptive to different types of operations (HTTP requests use different defaults than data processing)
  • Optional if you prefer to manage concurrency yourself

This approach gives you the benefits of parallel processing while protecting your application from resource overconsumption.

Installation

Install the package via Composer:

composer require keggermont/laravel-amphp

The package will automatically register the service provider and facade through Laravel's auto-discovery.

If you want to customize the configuration, you can publish the configuration file:

php artisan vendor:publish --tag=laravel-amphp-config

Usage

Helper Functions

The package provides simple global helper functions:

// Run multiple closures in parallel
$results = parallel_run([
    function() {
        sleep(1);
        return 'Task 1 complete';
    },
    function() {
        sleep(2);
        return 'Task 2 complete';
    },
    function() {
        sleep(1);
        return 'Task 3 complete';
    }
]);
// Total execution time: ~2 seconds instead of 4 seconds

// Process a collection in parallel
$items = collect([1, 2, 3, 4, 5]);
$results = parallel_map($items, function($item) {
    // Simulate some time-consuming work
    sleep(1);
    return $item * 2;
});
// Returns: [2, 4, 6, 8, 10] in ~1 second instead of 5 seconds

// Make HTTP requests in parallel
$responses = parallel_http([
    'https://example.com',
    'https://php.net',
    'https://github.com'
]);
// All requests run concurrently

// Wait for all tasks to complete
$success = parallel_all([
    function() { return doSomething(); },
    function() { return doSomethingElse(); }
]);
// Returns true if all tasks succeed, false otherwise

// Get the first successful result
$result = parallel_any([
    function() { 
        sleep(3);
        return 'Slow task finished'; 
    },
    function() { 
        sleep(1);
        return 'Fast task finished'; 
    }
]);
// Returns: 'Fast task finished' almost immediately

Using the Facade

You can also use the Parallel facade for a more object-oriented approach:

use Keggermont\LaravelAmphp\Facades\Parallel;

// Run multiple closures in parallel
$results = Parallel::run([
    function() { return 'result 1'; },
    function() { return 'result 2'; }
]);

// Process items in parallel
$results = Parallel::map([1, 2, 3, 4, 5], function($item) {
    return $item * 2;
});

// Make HTTP requests in parallel
$responses = Parallel::http([
    'https://example.com',
    'https://php.net'
]);

// More methods
$allSucceeded = Parallel::all([...]);
$firstSuccess = Parallel::any([...]);

Collection Macros

This package extends Laravel's Collection with macros for parallel processing:

// Process a collection in parallel
$collection = collect([1, 2, 3, 4, 5]);
$results = $collection->parallel(function($item) {
    // Simulating time-consuming work
    sleep(1);
    return $item * 2;
});
// Returns a new collection with [2, 4, 6, 8, 10]

// Execute a collection of closures and ensure all succeed
$tasks = collect([
    function() { return doSomething(); },
    function() { return doSomethingElse(); }
]);
$allSucceeded = $tasks->parallelAll();

// Execute a collection of closures and get the first successful result
$strategies = collect([
    function() { /* slow operation */ },
    function() { /* fast operation */ },
    function() { /* medium operation */ }
]);
$firstSuccess = $strategies->parallelAny('default value if all fail');

HTTP Requests in Detail

The parallel HTTP functionality accepts various configuration options:

$responses = parallel_http([
    'https://example.com',
    'https://api.example.com/users'
], [
    'method' => 'POST',
    'headers' => [
        'Authorization' => 'Bearer token',
        'Content-Type' => 'application/json'
    ],
    'body' => json_encode(['name' => 'John'])
]);

// Each response has this structure:
[
    'status' => 200,
    'body' => '{"result": "success"}',
    'headers' => ['Content-Type' => 'application/json', ...],
    'url' => 'https://example.com' // The original URL requested
]
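
Because each response is a plain array with the structure shown above, handling the results is straightforward. For example (a sketch assuming JSON response bodies):

foreach ($responses as $response) {
    if ($response['status'] === 200) {
        $data = json_decode($response['body'], true);
        // ... use $data
    } else {
        logger()->warning("Request to {$response['url']} failed with status {$response['status']}");
    }
}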

Performance Benefits

Real-world performance improvements when using this package:

| Operation                        | Sequential             | Parallel                  | Improvement |
|----------------------------------|------------------------|---------------------------|-------------|
| 10 API calls (200ms each)        | ~2000ms                | ~250ms                    | 8x faster   |
| Processing 100 items (50ms each) | ~5000ms                | ~500ms (on an 8-core CPU) | 10x faster  |
| Multiple file operations         | Blocked by slowest I/O | Limited only by CPU       | Significant |

Advanced Configuration

The package includes configuration options for fine-tuning performance:

// config/laravel-amphp.php
return [
    // Enable automatic chunking for large workloads
    'auto_chunk' => true,
    
    // Default chunk size for batching operations
    'default_chunk_size' => 100,
    
    // Method-specific chunk sizes
    'chunk_sizes' => [
        'run' => 50,
        'map' => 100,
        'http' => 20,
        'all' => 100,
        'any' => 20,
    ],
];

Laravel Integration

The package seamlessly integrates with Laravel's ecosystem:

Artisan Commands

The package provides several Artisan commands to help you manage and optimize your application:

# Display information about the Laravel AmPHP package
php artisan laravel-amphp:info

# Cache the Laravel AmPHP configuration for improved performance
php artisan laravel-amphp:cache

# Clear the Laravel AmPHP cached configuration
php artisan laravel-amphp:clear-cache

Laravel Optimize Integration

The package integrates with Laravel's built-in optimization commands. When you run:

php artisan optimize

The Laravel AmPHP configuration will also be cached for better performance.

Similarly, when you run:

php artisan optimize:clear

The Laravel AmPHP cache will be cleared along with other Laravel caches.

About Command Integration

The package also integrates with Laravel's about command. To see information about the package:

php artisan about

This will display the package version, chunking status, and default chunk size along with other information about your Laravel application.

Examples

Check the examples directory for more usage examples, including:

  • Parallel database queries
  • Complex API integrations
  • Batch processing workflows
  • Advanced error handling
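
As a taste of the API-integration pattern, fanning out to several services and combining the results might look roughly like this (the endpoint URLs are placeholders, and responses are matched back to their URLs via the 'url' key shown earlier rather than relying on ordering):

use Keggermont\LaravelAmphp\Facades\Parallel;

// Fan out to several services at once, then combine the results.
$responses = Parallel::http([
    'https://api.example.com/users/42',
    'https://api.example.com/orders?user=42',
]);

// Key responses by the URL they came from, then decode the JSON bodies.
$byUrl  = collect($responses)->keyBy('url');
$user   = json_decode($byUrl['https://api.example.com/users/42']['body'], true);
$orders = json_decode($byUrl['https://api.example.com/orders?user=42']['body'], true);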

Testing Your Code

When testing code that uses AmPHP, you can use the included MockHttpClient for HTTP tests:

use Keggermont\LaravelAmphp\Tests\Unit\Http\MockHttpClient;

// Create a mock client with predefined responses
$mockClient = new MockHttpClient();
$mockClient->addResponse('https://example.com', [
    'status' => 200,
    'body' => 'Example response',
    'headers' => ['Content-Type' => 'text/plain']
]);

// Bind the mock client to the container
$this->app->instance('laravel-amphp.http-client', $mockClient);

// Now your tests using Parallel::http() will use the mock client
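
A test might then look something like this (a sketch that assumes the response array structure shown in the HTTP section):

use Keggermont\LaravelAmphp\Facades\Parallel;

$responses = Parallel::http(['https://example.com']);
$response  = collect($responses)->firstWhere('url', 'https://example.com');

$this->assertSame(200, $response['status']);
$this->assertSame('Example response', $response['body']);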

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This package is open-sourced software licensed under the MIT license.