mguinea / laravel-robots
Laravel package to manage robots in an easy way
Installs: 42 030
Dependents: 0
Suggesters: 0
Security: 0
Stars: 16
Watchers: 3
Forks: 3
Open Issues: 5
Type: laravel-package
Requires
- php: >=8.1
- illuminate/database: ^6.0|^7.0|^8.0|^9.0|^10.0|^11.0
Requires (Dev)
- orchestra/testbench: ^4.0|^5.0|^6.0|^7.0|^8.0|^9.0
- phpunit/phpunit: ^8.0|^9.0
README
Laravel package to manage robots easily.
If you need a detailed explanation of how the robots.txt file works, visit http://www.robotstxt.org/robotstxt.html
This package allows you to manage your site's robots.txt dynamically, letting you differentiate between environments or configurations.
The migration to persist the configuration is optional; you can swap in your own data source.
Once the package is installed, you can generate robots.txt responses dynamically, write a physical robots.txt file, or build the rules from a data source, as shown in the Usage section below.
Installing
You can install it via Composer:
composer require mguinea/laravel-robots
Running the tests
Just execute
vendor/bin/phpunit
The unit tests cover all methods of the Robots class and its related facade.
Usage
1. Dynamically
You can use Robots in your routes file to generate a dynamic response:
Route::get('robots.txt', function () {
    $robots = new \Mguinea\Robots\Robots;

    // If on the live server
    if (App::environment() == 'production') {
        $robots->addUserAgent('*')->addSitemap('sitemap.xml');
    } else {
        // If you're on any other server, tell everyone to go away.
        $robots->addDisallow('/');
    }

    return response($robots->generate(), 200)
        ->header('Content-Type', 'text/plain');
});
1.1. Dynamically with facade
You can use the Robots facade in your routes file to generate a dynamic response:
<?php

use Mguinea\Robots\Facades\Robots;

Route::get('robots.txt', function () {
    // If on the live server
    if (App::environment() == 'production') {
        Robots::addUserAgent('*');
        Robots::addSitemap('sitemap.xml');
    } else {
        // If you're on any other server, tell everyone to go away.
        Robots::addDisallow('/');
    }

    return response(Robots::generate(), 200)
        ->header('Content-Type', 'text/plain');
});
2. To robots.txt default file
If you prefer to write a physical robots.txt file, just use the generator as shown above:
<?php

use Illuminate\Support\Facades\File;
use Mguinea\Robots\Robots;

class Anywhere
{
    public function createFile()
    {
        $robots = new Robots;
        $robots->addUserAgent('*')->addSitemap('sitemap.xml');

        File::put(public_path('robots.txt'), $robots->generate());
    }
}
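A class like the one above can be invoked from an Artisan command so the file is rebuilt on demand, for example during deployment. A minimal sketch, assuming a hypothetical command name robots:generate (not part of this package):

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\File;
use Mguinea\Robots\Robots;

class GenerateRobots extends Command
{
    // Hypothetical command name; choose whatever fits your project.
    protected $signature = 'robots:generate';

    protected $description = 'Write a fresh robots.txt to the public directory';

    public function handle()
    {
        $robots = new Robots;
        $robots->addUserAgent('*')->addSitemap('sitemap.xml');

        // Overwrite the static file with the freshly generated rules.
        File::put(public_path('robots.txt'), $robots->generate());

        $this->info('robots.txt written to ' . public_path('robots.txt'));
    }
}

Run it with php artisan robots:generate; recent Laravel versions auto-discover commands placed in app/Console/Commands.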
3. Building from Data Source
You may prefer to build it from a data source. To do so, just instantiate the Robots object with an array of key-value parameters, as shown below.
Note that comments and spacers have been removed.
<?php

use Mguinea\Robots\Robots;

class Anywhere
{
    public function fromArray()
    {
        $robots = new Robots([
            'allows' => ['foo', 'bar'],
            'disallows' => ['foo', 'bar'],
            'hosts' => ['foo', 'bar'],
            'sitemaps' => ['foo', 'bar'],
            'userAgents' => ['foo', 'bar'],
            'crawlDelay' => 10,
        ]);

        return response($robots->generate(), 200)
            ->header('Content-Type', 'text/plain');
    }
}
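That constructor array can come from anywhere, which is what makes the persistence layer swappable. A sketch of assembling it from a database, assuming a hypothetical robot_rules table with type and value columns (the package's optional migration may use a different schema):

<?php

use Illuminate\Support\Facades\DB;
use Mguinea\Robots\Robots;

class RobotsFromDatabase
{
    public function response()
    {
        // Group the stored rows by their rule type, e.g. 'allow', 'disallow'.
        $rules = DB::table('robot_rules')->get()->groupBy('type');

        // Map each group onto the constructor keys the package expects.
        $robots = new Robots([
            'userAgents' => $rules->get('userAgent', collect())->pluck('value')->all(),
            'allows' => $rules->get('allow', collect())->pluck('value')->all(),
            'disallows' => $rules->get('disallow', collect())->pluck('value')->all(),
            'sitemaps' => $rules->get('sitemap', collect())->pluck('value')->all(),
        ]);

        return response($robots->generate(), 200)
            ->header('Content-Type', 'text/plain');
    }
}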
Methods
You can use the Robots class methods individually or chained.
Remember that you can use the facade to avoid instantiation.
<?php

// Add an allow rule to the robots. Output: Allow: foo
$robots->addAllow('foo');

// Add multiple allow rules to the robots. Output: Allow: foo / Allow: bar
$robots->addAllow(['foo', 'bar']);

// Add a comment to the robots. Output: # foo
$robots->addComment('foo');

// Add a disallow rule to the robots. Output: Disallow: foo
$robots->addDisallow('foo');

// Add multiple disallow rules to the robots. Output: Disallow: foo / Disallow: bar
$robots->addDisallow(['foo', 'bar']);

// Add a host to the robots. Output: Host: foo
$robots->addHost('foo');

// Add multiple hosts to the robots. Output: Host: foo / Host: bar
$robots->addHost(['foo', 'bar']);

// Add a sitemap to the robots. Output: Sitemap: foo
$robots->addSitemap('foo');

// Add multiple sitemaps to the robots. Output: Sitemap: foo / Sitemap: bar
$robots->addSitemap(['foo', 'bar']);

// Add a spacer (blank line) to the robots.
$robots->addSpacer();

// Add a User-agent to the robots. Output: User-agent: foo
$robots->addUserAgent('foo');

// Add multiple User-agents to the robots. Output: User-agent: foo / User-agent: bar
$robots->addUserAgent(['foo', 'bar']);

// Add a crawl-delay to the robots. Output: crawl-delay: 10
$robots->addCrawlDelay(10);

// Generate the robots data.
$robots->generate();

// Reset the rows.
$robots->reset();
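Putting it together, the methods can be composed into a complete file in one chain. A small sketch, assuming each add* method returns the instance as the chained example in section 1 suggests:

<?php

use Mguinea\Robots\Robots;

$robots = new Robots;

// Compose the rules fluently, in the order they should appear.
$robots->addUserAgent('*')
    ->addDisallow('/admin')
    ->addAllow('/public')
    ->addSpacer()
    ->addSitemap('https://example.com/sitemap.xml');

echo $robots->generate();

// Illustrative output (exact formatting depends on the package):
// User-agent: *
// Disallow: /admin
// Allow: /public
//
// Sitemap: https://example.com/sitemap.xml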
Contributing
Please read CONTRIBUTING.md for details on our code of conduct and the process for submitting pull requests.
Security
If you discover any security-related issues, please email develop.marcguinea@gmail.com instead of using the issue tracker.
Versioning
We use SemVer for versioning. For the versions available, see the tags on this repository.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Authors
- Marc Guinea (MarcGuinea)