awcodes/botly
Botly is a Filament plugin to manage your site's robots.txt file directly from a Filament admin panel.
Requires
- php: ^8.2
- filament/filament: ^4.0|^5.0
- spatie/laravel-package-tools: ^1.15.0
Requires (Dev)
- larastan/larastan: ^3.0
- laravel/pint: ^1.0
- nunomaduro/collision: ^8.0
- orchestra/testbench: ^9.0|^10.0
- pestphp/pest: ^3.0|^4.0
- pestphp/pest-plugin-arch: ^3.0|^4.0
- pestphp/pest-plugin-laravel: ^3.0|^4.0
- pestphp/pest-plugin-livewire: ^3.0|^4.0
- rector/rector: ^2.0
- spatie/laravel-ray: ^1.26
README
Botly is a Filament plugin to manage your site's robots.txt file directly from the Filament admin panel. Rules, sitemaps, and AI crawler blocks are stored in the database and served dynamically — no static file required.
Installation
Install the package via Composer:
composer require awcodes/botly
Run the installation command to publish and run the migrations:
php artisan botly:install
Or publish and run the migration manually:
php artisan vendor:publish --tag="botly-migrations"
php artisan migrate
Optionally publish the config file:
php artisan vendor:publish --tag="botly-config"
Setup
Register the plugin in your Filament panel provider:
use Awcodes\Botly\BotlyPlugin;

$panel->plugins([
    BotlyPlugin::make(),
]);
That's it. Botly registers a Robots Manager page in your panel and automatically serves /robots.txt via a dynamic route.
How It Works
Botly stores your robots configuration in the database. When /robots.txt is requested, the rules are read from the database and formatted as valid robots.txt output on the fly. You can also export the current configuration to a static public/robots.txt file using the Export Robots.txt button on the admin page.
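As an illustrative sketch, assuming a single disallow rule and one sitemap are stored in the database, the served /robots.txt could look like this (the exact output depends on your stored configuration):

User-agent: *
Disallow: /admin

Sitemap: https://example.com/sitemap.xml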
Important
If a static public/robots.txt file already exists, Botly will display a warning in the admin UI. The file must be deleted or renamed before the dynamic route can take effect.
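For example, on a typical deployment you can simply delete the static file:

rm public/robots.txt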
Configuration
The published config file (config/botly.php) allows you to set default values that are used when no database record exists yet:
return [
    'defaults' => [
        'rules' => [],
        'sitemaps' => [],
        'ai_crawlers' => [],
    ],
    'persistent_rules' => [],
];
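As an illustration, the defaults can be pre-seeded so a sensible robots.txt is served before any database record exists. The rule shape below matches the persistent_rules format documented in the next section; the shapes of the sitemaps and ai_crawlers entries are assumptions for illustration only:

return [
    'defaults' => [
        'rules' => [
            ['user_agent' => '*', 'directive' => 'disallow', 'path' => '/admin'],
        ],
        // assumed shapes, for illustration
        'sitemaps' => ['https://example.com/sitemap.xml'],
        'ai_crawlers' => ['GPTBot'],
    ],
    'persistent_rules' => [],
];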
Persistent Rules
Persistent rules are rules that are always included in the output and cannot be edited or deleted from the admin UI. You can define them in the config file or fluently on the plugin:
Via config:
// config/botly.php
'persistent_rules' => [
    [
        'user_agent' => '*',
        'directive' => 'disallow',
        'path' => '/admin',
    ],
],
Via plugin:
BotlyPlugin::make()
    ->persistentRules([
        [
            'user_agent' => '*',
            'directive' => 'disallow',
            'path' => '/admin',
        ],
    ]),
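With either definition, every /robots.txt response should include the corresponding lines, using the standard robots.txt rendering of a disallow rule:

User-agent: *
Disallow: /admin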
Each rule is an array with three keys:
| Key        | Values                                      |
|------------|---------------------------------------------|
| user_agent | Any string, e.g. *, Googlebot               |
| directive  | allow, disallow, crawl-delay, clean-param   |
| path       | The path to allow or disallow, e.g. /admin  |
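The same array shape covers the other directives. For example, a hypothetical allow rule that re-opens a subpath under an otherwise blocked directory:

[
    'user_agent' => 'Googlebot',
    'directive' => 'allow',
    'path' => '/admin/public',
],

In the served file this should render as Allow: /admin/public under a User-agent: Googlebot group.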
Customisation
Navigation
BotlyPlugin::make()
    ->navigationIcon('heroicon-o-robot')
    ->navigationGroup('Settings')
    ->navigationLabel('Robots.txt'),
Page
BotlyPlugin::make()
    ->title('Robots Manager')
    ->slug('robots-manager'),
AI Crawler Blocking
The admin page includes a Block AI Crawlers checkbox list. Selecting crawlers adds a Disallow: / entry for each one to the output. Botly ships with a curated list of known AI crawlers, including GPTBot, ClaudeBot, PerplexityBot, and more.
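For example, selecting GPTBot and ClaudeBot yields entries like these in the served file:

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /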
Testing
composer test
Contributing
Please see CONTRIBUTING for details.
Security Vulnerabilities
Please review our security policy on how to report security vulnerabilities.
Credits
License
The MIT License (MIT). Please see License File for more information.