Middleware to enable or disable search engine robots

v2.0.1 2020-12-02 00:06 UTC




Middleware to enable or disable search engine robots in non-production environments. It automatically adds the X-Robots-Tag header to every response and returns a default body for requests to /robots.txt.



This package is installable and autoloadable via Composer as middlewares/robots.

composer require middlewares/robots


$dispatcher = new Dispatcher([
    new Middlewares\Robots(false)
]);

$response = $dispatcher->dispatch(new ServerRequest());

echo $response->getHeaderLine('X-Robots-Tag'); //noindex, nofollow, noarchive
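A request for /robots.txt is intercepted by the middleware and answered directly with a generated body. A minimal sketch of that flow, assuming the Dispatcher and Factory helpers from the companion middlewares/utils package, and assuming the generated body is a standard disallow-all directive when robots are blocked (not verified against the package's exact output):

```php
use Middlewares\Utils\Dispatcher;
use Middlewares\Utils\Factory;

$dispatcher = new Dispatcher([
    new Middlewares\Robots(false)
]);

// Build a PSR-7 request for the robots.txt path (assumed helper
// from middlewares/utils; any PSR-17 factory would work).
$request = Factory::createServerRequest('GET', '/robots.txt');

$response = $dispatcher->dispatch($request);

// The middleware is expected to answer this path itself, with a
// body along the lines of:
//   User-Agent: *
//   Disallow: /
echo $response->getBody();
```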


The constructor's first argument configures whether search engines are blocked or allowed.

//Disallow search engine robots
$robots = new Middlewares\Robots(false);

//Allow search engine robots
$robots = new Middlewares\Robots(true);

Optionally, you can provide a Psr\Http\Message\ResponseFactoryInterface as the second argument, used to create the responses to /robots.txt requests. If it's not defined, Middlewares\Utils\Factory will be used to detect it automatically.

$responseFactory = new MyOwnResponseFactory();

$robots = new Middlewares\Robots(false, $responseFactory);


If your site has a sitemap, use the sitemap option to add its URL to the robots.txt response.

$robots = (new Middlewares\Robots(true))->sitemap('/sitemap.xml');
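With the sitemap option set as above, the body served for /robots.txt can be expected to carry a Sitemap directive alongside the allow rule. An illustrative fragment (the exact default output is an assumption, not verified against the package):

```
User-Agent: *
Allow: /
Sitemap: /sitemap.xml
```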

Please see CHANGELOG for more information about recent changes and CONTRIBUTING for contributing details.

The MIT License (MIT). Please see LICENSE for more information.