justinkekeocha / database-dump
This package will save you from losing database records should you run the Laravel migrate:fresh command without first exporting a database dump.
Requires
- php: ^8.1
- illuminate/contracts: ^10.0|^11.0
- spatie/laravel-package-tools: ^1.14.0
Requires (Dev)
- laravel/pint: ^1.0
- nunomaduro/collision: ^7.8
- nunomaduro/larastan: ^2.0.1
- orchestra/testbench: ^8.8
- pestphp/pest: ^2.20
- pestphp/pest-plugin-arch: ^2.0
- pestphp/pest-plugin-laravel: ^2.0
- phpstan/extension-installer: ^1.1
- phpstan/phpstan-deprecation-rules: ^1.0
- phpstan/phpstan-phpunit: ^1.0
- spatie/laravel-ray: ^1.26
README
This package enhances the migrate:fresh command by creating a dump of your database, allowing you to make migration changes and then re-seed the database with the previous data. This is particularly useful for developers who need to preserve their data before running migrations.
Utilizing a memory-efficient method, this package streams records from the dump file, ensuring only one record is in memory at any given time. This approach allows it to handle arbitrarily large dump files without exhausting memory.
Inspired by the export function in phpMyAdmin, this package not only enables you to restore your data but also provides the flexibility to alter the data during the seeding process.
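To illustrate the general streaming idea described above (this is a minimal sketch, not the package's actual implementation), the snippet below reads a hypothetical line-delimited JSON dump with stream_get_line, so only one record is buffered at a time, and inserts rows in chunks:

<?php

// Sketch only: assumes a hypothetical dump format with one JSON record per
// line. The real dump layout used by the package may differ.

use Illuminate\Support\Facades\DB;

function streamRecords(string $path, string $table, int $chunkLength = 5000): void
{
    $handle = fopen($path, 'r');
    $buffer = [];

    // stream_get_line reads up to the delimiter, so only one record
    // is held in memory at a time, regardless of file size.
    while (($line = stream_get_line($handle, 2 * 1024 * 1024, "\n")) !== false) {
        $buffer[] = json_decode($line, true);

        if (count($buffer) >= $chunkLength) {
            DB::table($table)->insert($buffer);
            $buffer = [];
        }
    }

    if ($buffer !== []) {
        DB::table($table)->insert($buffer);
    }

    fclose($handle);
}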
Installation
You can install the package via composer:
composer require justinkekeocha/database-dump
You can publish the config file with:
php artisan vendor:publish --tag="database-dump-config"
These are the contents of the published config file:
return [

    /*
     * Enable or disable the package.
     */
    'enable' => true,

    /*
     * Set the folder generated dumps should be saved in.
     */
    'folder' => database_path('dumps/'),

    /*
     * Set the chunk length of data to be processed at once.
     */
    'chunk_length' => 5000,

    /*
     * Set the maximum stream length of data to be processed at once.
     * This is the maximum size a row in a table is expected to have in your database.
     * This is set to a reasonable default of 2MB.
     * If your database rows are larger than this, you may want to increase this value.
     * Read more: https://www.php.net/manual/en/function.stream-get-line.php
     */
    'stream_length' => (2 * 1024 * 1024),
];
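If you only want dumps generated in certain environments, you can edit the published config to read from an environment variable. The DATABASE_DUMP_ENABLE key below is just an illustrative name, not something the package defines:

<?php

// config/database-dump.php (published copy)

return [
    // DATABASE_DUMP_ENABLE is a hypothetical env key used here for illustration;
    // any env() call works as long as 'enable' ends up boolean.
    'enable' => env('DATABASE_DUMP_ENABLE', true),

    // ...keep the remaining keys from the published file unchanged.
];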
Usage
Dump database data
# Dump database data before running migrations
php artisan migrate:fresh

# Dump database data
php artisan database:dump
Seed database with dump file
Load the dump file in DatabaseSeeder and pass it to the individual table seeders through the $this->call method:
# database/seeders/DatabaseSeeder.php

namespace Database\Seeders;

use Illuminate\Database\Seeder;
use Justinkekeocha\DatabaseDump\Facades\DatabaseDump;
use Database\Seeders\UserSeeder;

class DatabaseSeeder extends Seeder
{
    /**
     * Seed the application's database.
     */
    public function run(): void
    {
        $databaseDump = DatabaseDump::getLatestDump("save/2024_04_14_233109.json");

        $this->command->outputComponents()->info("Using dump: $databaseDump->filePath");

        $this->call([
            UserSeeder::class,
        ], parameters: compact('databaseDump'));
    }
}
The dump's table data is now available to each individual seeder, which can seed its table with the data provided:
# database/seeders/UserSeeder.php

namespace Database\Seeders;

use App\Models\User;
use Illuminate\Database\Seeder;

class UserSeeder extends Seeder
{
    /**
     * Run the database seeds.
     */
    public function run($databaseDump): void
    {
        $databaseDump->seed(User::class);

        // You can also use the table name instead of the model.
        $databaseDump->seed('users');
    }
}
You can manipulate the rows before seeding:
# database/seeders/CountrySeeder.php

namespace Database\Seeders;

use App\Models\Country;
use Illuminate\Database\Seeder;

class CountrySeeder extends Seeder
{
    /**
     * Run the database seeds.
     */
    public function run($databaseDump): void
    {
        $databaseDump->seed(Country::class, formatRowCallback: function ($row) {
            // 331.69 ms
            return [
                'id' => $row['id'],
                'name' => $row['name'],
                'code' => 22,
            ];

            // OR
            // 338.95 ms
            $changes = ['code' => '22'];

            return collect($row)->only(['id', 'name'])->merge($changes)->all();
        });
    }
}
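With the seeders wired up, restoring the data is the usual Laravel seeding flow, for example php artisan db:seed.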
Get specific dump file
use Justinkekeocha\DatabaseDump\Facades\DatabaseDump;

// Get a dump by its position in the directory listing.
// The array starts from the latest dump file in config('database-dump.folder').
DatabaseDump::getDump(1); // Get the second dump in the array.

// Get a dump by file name.
DatabaseDump::getDump("2024_01_08_165939.json");

// Get the latest dump.
DatabaseDump::getLatestDump();
Seed table
use Justinkekeocha\DatabaseDump\Facades\DatabaseDump;

use App\Models\Country;
use App\Models\Timezone;
use App\Models\User;

DatabaseDump::getLatestDump()->seed(User::class);

// You can seed multiple tables at once.
DatabaseDump::getLatestDump()->seed(Country::class)
    ->seed(Timezone::class)
    ->seed(User::class);
When seeding from the same dump file, it is more efficient to chain seed calls on a single instance. The first seed call reads the whole file and builds a schema recording the offset of each table in the file before seeding starts; subsequent seed calls on the same instance (and therefore the same file) simply jump to the recorded offset for the table and read from there.
use Justinkekeocha\DatabaseDump\Facades\DatabaseDump;

use App\Models\Country;
use App\Models\Timezone;
use App\Models\User;

// The whole file will be read 3 times.
DatabaseDump::getLatestDump()->seed(Country::class);
DatabaseDump::getLatestDump()->seed(Timezone::class);
DatabaseDump::getLatestDump()->seed(User::class);

// The whole file will be read only once.
DatabaseDump::getLatestDump()->seed(Country::class)
    ->seed(Timezone::class)
    ->seed(User::class);
Sample
A sample dump can be found here.
Testing
composer test
Changelog
Please see CHANGELOG for more information on what has changed recently.
Contributing
Please see CONTRIBUTING for details.
Security Vulnerabilities
Please review our security policy on how to report security vulnerabilities.
Credits
License
The MIT License (MIT). Please see License File for more information.