ascentech / massive-csv-import
This lightweight package helps developers import CSV files with millions of records efficiently using Laravel Queues.
README
Prerequisites
- You must be using Laravel Queues, and the jobs table must exist in your database. If you are not using queues yet, set them up first (see the official Laravel queue documentation).
- Write privileges on the `storage` directory of your Laravel project. You can change this location in the configuration file of this package as well.
- By default, this package searches for the required Model class in the `App\Models\` namespace. If you have placed your Models in another directory, set its path in the configuration file, i.e., `vendor\ascentech\massive-csv-import\config\massive-csv-import.php`.
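If you have not configured queues yet, a typical database-driver setup looks like the following. These are standard Laravel commands, not part of this package; adjust them to the queue driver you actually use:

```shell
# Use the database queue driver (set in your project's .env file):
#   QUEUE_CONNECTION=database

# Create the migration for the jobs table, then run it
php artisan queue:table
php artisan migrate
```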
Installation
- composer require ascentech/massive-csv-import
- Add `Ascentech\MassiveCsvImport\MassiveCsvImportServiceProvider::class,` to the `providers` array of your project's `config\app.php` file.
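For reference, the relevant excerpt of `config\app.php` would look roughly like this (only the added line is specific to this package; the rest of your `providers` array stays as it is):

```php
<?php
// config/app.php (excerpt)
return [
    // ...
    'providers' => [
        // Your project's other service providers...
        Ascentech\MassiveCsvImport\MassiveCsvImportServiceProvider::class,
    ],
];
```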
Usage
- Prepare a large csv file (without headers) to import.
- Prepare a file upload interface in your project and add the following two lines to your Controller code:
- `use Ascentech\MassiveCsvImport\MassiveCsvImportFacade;`
- `$result = MassiveCsvImportFacade::import($path, $table_name, $columns);`
- `$path` refers to the temp path of the uploaded csv file.
- `$table_name` is the database table into which you want to import the large csv file.
- `$columns` is the array of columns for that table, e.g., `$columns = ['name','description','status'];`
- This package will create multiple smaller csv files from the large file and save them into the `storage\table_name\` directory. By default the chunk size is 1000; you can edit the `csv_chunk_size` variable's value in the configuration file, i.e., `vendor\ascentech\massive-csv-import\config\massive-csv-import.php`.
- A separate job is created for each smaller csv file to process it in the background.
- You will need to run the `php artisan queue:work` command for the jobs to be processed.
- All processed files will be placed with a `.csv-processed` extension in the same `storage\table_name\` directory.
- Remember! If a particular record (from a smaller csv file) fails to insert into the database, an error message is written to the laravel.log file, but the rest of the job keeps processing without failing. A separate directory, `storage\table_name\failed`, is automatically created and will contain csv files with only the failed records. You can fix these and import them later in a separate csv file.
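To make the chunking step above concrete, here is a minimal, self-contained sketch of the general approach: splitting a large, headerless CSV into smaller files of N rows each. The function names and signatures here are hypothetical illustrations, not this package's internal API; the package additionally dispatches a queued job per chunk file.

```php
<?php
// Hypothetical sketch of CSV chunking (NOT the package's actual implementation).

// Split the csv at $path into files of at most $chunkSize rows inside $outDir,
// returning the list of chunk file paths in order.
function splitCsvIntoChunks(string $path, string $outDir, int $chunkSize = 1000): array
{
    if (!is_dir($outDir)) {
        mkdir($outDir, 0777, true);
    }
    $chunkFiles = [];
    $rows = [];
    $chunkIndex = 0;
    $in = fopen($path, 'r');
    while (($row = fgetcsv($in)) !== false) {
        $rows[] = $row;
        if (count($rows) === $chunkSize) {
            $chunkFiles[] = writeChunk($outDir, $chunkIndex++, $rows);
            $rows = [];
        }
    }
    fclose($in);
    if ($rows !== []) {
        // Write the final, partially filled chunk.
        $chunkFiles[] = writeChunk($outDir, $chunkIndex, $rows);
    }
    return $chunkFiles;
}

// Write one chunk of rows to a numbered csv file and return its path.
function writeChunk(string $outDir, int $index, array $rows): string
{
    $file = sprintf('%s/chunk_%04d.csv', $outDir, $index);
    $out = fopen($file, 'w');
    foreach ($rows as $row) {
        fputcsv($out, $row);
    }
    fclose($out);
    return $file;
}
```

In the package's workflow, each returned chunk file would then be handed to a queued job, which inserts its rows and renames the file with the `.csv-processed` extension when done.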
License
- MIT