wesleydeveloper / data-processor
High-performance data import/export package for Laravel with cloud storage support, chunking, and queue processing
v0.1.1
2025-08-02 15:38 UTC
Requires
- php: ^8.3
- illuminate/filesystem: ^11.0|^12.0
- illuminate/queue: ^11.0|^12.0
- illuminate/support: ^11.0|^12.0
- illuminate/validation: ^11.0|^12.0
- openspout/openspout: ^4.30
Requires (Dev)
- fakerphp/faker: ^1.24
- laravel/pint: ^1.24
- mockery/mockery: ^1.6
- orchestra/testbench: ^9.0|^10.0
- phpstan/phpstan: ^2.1
- phpunit/phpunit: ^11.0|^12.0
README
A high-performance Laravel package for importing and exporting large datasets with cloud storage support, automatic chunking, queue processing, and memory-efficient generators.
Built on top of OpenSpout for maximum performance and minimal memory usage.
🌟 Features
- ⚡ High Performance: Process millions of rows with minimal memory usage
- ☁️ Cloud Storage: Native support for AWS S3, Google Cloud Storage, Azure, and more
- 🔄 Auto Chunking: Automatically splits large files into smaller chunks
- 📋 Queue Support: Background processing with Laravel Queues
- 🧠 Memory Efficient: Uses PHP generators to handle large datasets (see the sketch after this list)
- 📁 Multiple Formats: Excel (XLSX), CSV, ODS support
- ✅ Data Validation: Built-in validation with Laravel's validator
- 🎯 Laravel Integration: Seamless integration with the Laravel ecosystem
- 🧪 Well Tested: Comprehensive test suite with performance benchmarks
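To make the memory-efficiency point concrete, here is a minimal, package-independent sketch of the generator pattern referred to above: rows are yielded one at a time instead of being loaded into a single large array. The `readRows()` helper is purely illustrative and is not part of this package's API.

```php
<?php

// Illustrative only, not part of this package's API: a plain-PHP generator
// that streams CSV rows one at a time, so memory stays flat regardless of file size.
function readRows(string $path): Generator
{
    $handle = fopen($path, 'rb');

    if ($handle === false) {
        throw new RuntimeException("Unable to open {$path}");
    }

    try {
        while (($row = fgetcsv($handle)) !== false) {
            yield $row; // hand back one row, then pause until the caller asks for the next
        }
    } finally {
        fclose($handle);
    }
}

foreach (readRows('huge-file.csv') as $row) {
    // Process one row at a time; earlier rows are never retained in memory.
}
```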
📋 Requirements
- PHP 8.3+
- Laravel 11.0+ or 12.0+
- OpenSpout 4.0+
🐳 Docker
Use Docker to run the test suite in an isolated PHP 8.3 environment:

```bash
# Build the image
docker build -t data-processor-tests .

# Run the tests (uses the code already copied into the image, no volume mount)
docker run --rm data-processor-tests
```
📦 Installation
You can install the package via Composer:

```bash
composer require wesleydeveloper/data-processor
```
Publish the config file:
```bash
php artisan vendor:publish --tag="data-processor-config"
```
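The published configuration controls package-wide defaults. The keys below are a minimal sketch based on the feature list (storage disk, queue, chunking); they are assumptions for illustration, not the package's documented options, so check the published file for the real names.

```php
<?php

// Illustrative sketch only: these keys are assumptions, not the package's
// documented options. Inspect the published config file for the real names.
return [
    'disk' => env('DATA_PROCESSOR_DISK', 'local'),     // hypothetical: filesystem disk for source/target files
    'queue' => env('DATA_PROCESSOR_QUEUE', 'default'), // hypothetical: default queue for background jobs
    'chunk_size' => 1000,                              // hypothetical: rows processed per chunk
];
```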
🚀 Quick Start
Import Data
Create an import class:
```php
<?php

namespace App\Imports;

use Wesleydeveloper\DataProcessor\Contracts\Importable;
use Wesleydeveloper\DataProcessor\Contracts\ShouldQueue;
use Wesleydeveloper\DataProcessor\Contracts\WithChunking;
use Illuminate\Support\Facades\DB;

class UsersImport implements Importable, ShouldQueue, WithChunking
{
    public function rules(): array
    {
        return [
            'name' => 'required|string|max:255',
            'email' => 'required|email|unique:users,email',
            'phone' => 'nullable|string',
        ];
    }

    public function map(array $row): array
    {
        return [
            'name' => $row[0],
            'email' => $row[1],
            'phone' => $row[2] ?? null,
            'created_at' => now(),
            'updated_at' => now(),
        ];
    }

    public function process(array $data): void
    {
        DB::table('users')->insert($data);
    }

    public function chunkSize(): int
    {
        return 1000;
    }

    // Queue configuration
    public function onQueue(): ?string
    {
        return 'imports';
    }

    public function timeout(): int
    {
        return 300;
    }

    public function memory(): int
    {
        return 512;
    }

    // Chunking configuration
    public function maxFileSize(): int
    {
        return 50 * 1024 * 1024; // 50MB
    }

    public function chunkRows(): int
    {
        return 10000;
    }
}
```
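`map()` reads rows by position, so the source file is expected to carry name, email, and phone in the first three columns. Purely as an illustration of that column order (whether a header row must be present or skipped depends on your file and the package's import behavior), a matching CSV could look like this:

```csv
John Doe,john@example.com,+1-555-0100
Jane Roe,jane@example.com,
```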
Process the import:
```php
use Wesleydeveloper\DataProcessor\Facades\DataProcessor;
use App\Imports\UsersImport;

DataProcessor::import(new UsersImport(), 'users-import.xlsx');
```
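Because `UsersImport` implements `ShouldQueue` and `onQueue()` returns `imports`, the heavy lifting runs on your queue workers. This is standard Laravel queue handling, so make sure a worker is listening on that queue:

```bash
php artisan queue:work --queue=imports
```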
Export Data
Create an export class:
```php
<?php

namespace App\Exports;

use Wesleydeveloper\DataProcessor\Contracts\Exportable;
use App\Models\User;
use Generator;

class UsersExport implements Exportable
{
    public function query(): Generator
    {
        // Stream users lazily so only one chunk of 1000 is held in memory at a time.
        // Note: yielding inside a User::chunk() callback would not work here,
        // because the yield would belong to the callback, not to query().
        foreach (User::lazy(1000) as $user) {
            yield $user;
        }
    }

    public function headings(): array
    {
        return ['ID', 'Name', 'Email', 'Created At'];
    }

    public function map($user): array
    {
        return [
            $user->id,
            $user->name,
            $user->email,
            $user->created_at->format('Y-m-d H:i:s'),
        ];
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
```
Process the export:
```php
use Wesleydeveloper\DataProcessor\Facades\DataProcessor;
use App\Exports\UsersExport;

DataProcessor::export(new UsersExport(), 'users-export.xlsx');
```
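For recurring exports, the same facade call can be wrapped in Laravel's scheduler. The snippet below is plain Laravel scheduling (Laravel 11+ `routes/console.php` style) around the call shown above, not a package-specific API:

```php
use Illuminate\Support\Facades\Schedule;
use Wesleydeveloper\DataProcessor\Facades\DataProcessor;
use App\Exports\UsersExport;

// Export users every night at 02:00, writing a dated file name.
Schedule::call(function () {
    DataProcessor::export(new UsersExport(), 'users-export-' . now()->format('Y-m-d') . '.xlsx');
})->dailyAt('02:00');
```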