awcodes/botly

Botly is a Filament plugin to manage your site's robots.txt file directly from the Filament admin panel. Rules, sitemaps, and AI crawler blocks are stored in the database and served dynamically — no static file required.


Installation

Install the package via Composer:

composer require awcodes/botly

Run the installation command to publish and run the migrations:

php artisan botly:install

Or publish and run the migration manually:

php artisan vendor:publish --tag="botly-migrations"
php artisan migrate

Optionally publish the config file:

php artisan vendor:publish --tag="botly-config"

Setup

Register the plugin in your Filament panel provider:

use Awcodes\Botly\BotlyPlugin;

$panel->plugins([
    BotlyPlugin::make(),
]);

That's it. Botly registers a Robots Manager page in your panel and automatically serves /robots.txt via a dynamic route.
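
For context, a minimal panel provider with the plugin registered might look like this (the class name and panel id are illustrative, not part of the package):

<?php

namespace App\Providers\Filament;

use Awcodes\Botly\BotlyPlugin;
use Filament\Panel;
use Filament\PanelProvider;

class AdminPanelProvider extends PanelProvider
{
    public function panel(Panel $panel): Panel
    {
        return $panel
            ->id('admin')
            ->path('admin')
            ->plugins([
                // Registers the Robots Manager page and the /robots.txt route
                BotlyPlugin::make(),
            ]);
    }
}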

How It Works

Botly stores your robots configuration in the database. When /robots.txt is requested, the rules are read from the database and formatted as valid robots.txt output on the fly. You can also export the current configuration to a static public/robots.txt file using the Export Robots.txt button on the admin page.
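
For example, a configuration with one wildcard disallow rule and one sitemap might be rendered along these lines (illustrative output; the exact formatting may differ):

User-agent: *
Disallow: /admin

Sitemap: https://example.com/sitemap.xml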

Important

If a static public/robots.txt file already exists, Botly will display a warning in the admin UI. The file must be deleted or renamed before the dynamic route can take effect.
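
For example, to move the static file out of the way:

mv public/robots.txt public/robots.txt.bak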

Configuration

The published config file (config/botly.php) allows you to set default values that are used when no database record exists yet:

return [
    'defaults' => [
        'rules' => [],
        'sitemaps' => [],
        'ai_crawlers' => [],
    ],
    'persistent_rules' => [],
];
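
Assuming entries in defaults.rules use the same shape as the persistent rules described below (an assumption, not confirmed by the package docs), a fresh install could be seeded like this:

// config/botly.php
return [
    'defaults' => [
        'rules' => [
            // Illustrative: assumes the persistent-rule shape applies here too
            [
                'user_agent' => '*',
                'directive' => 'disallow',
                'path' => '/admin',
            ],
        ],
        'sitemaps' => [],
        'ai_crawlers' => [],
    ],
    'persistent_rules' => [],
];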

Persistent Rules

Persistent rules are always included in the output and cannot be edited or deleted from the admin UI. You can define them in the config file or fluently on the plugin:

Via config:

// config/botly.php
'persistent_rules' => [
    [
        'user_agent' => '*',
        'directive' => 'disallow',
        'path' => '/admin',
    ],
],

Via plugin:

BotlyPlugin::make()
    ->persistentRules([
        [
            'user_agent' => '*',
            'directive' => 'disallow',
            'path' => '/admin',
        ],
    ]),

Each rule is an array with three keys:

Key          Values
user_agent   Any string, e.g. * or Googlebot
directive    One of allow, disallow, crawl-delay, clean-param
path         The path to allow or disallow, e.g. /admin

Customisation

Navigation

BotlyPlugin::make()
    ->navigationIcon('heroicon-o-robot')
    ->navigationGroup('Settings')
    ->navigationLabel('Robots.txt'),

Page

BotlyPlugin::make()
    ->title('Robots Manager')
    ->slug('robots-manager'),
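
With the slug above, the page would typically be served at /admin/robots-manager, assuming your panel lives at the admin path.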

AI Crawler Blocking

The admin page includes a Block AI Crawlers checkbox list. Selecting crawlers will add Disallow: / entries for each one in the output. Botly ships with a curated list of known AI crawlers including GPTBot, ClaudeBot, PerplexityBot, and more.
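
Selecting GPTBot and ClaudeBot, for instance, produces blocks like these in the output:

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /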

Testing

composer test

Contributing

Please see CONTRIBUTING for details.

Security Vulnerabilities

Please review our security policy on how to report security vulnerabilities.

Credits

License

The MIT License (MIT). Please see License File for more information.