becklyn/robots-txt

A library to build robots.txt files




Usage

First, you need to create a builder. With this builder you can create sections for different user agents and add directives to them.

You can also add a header to the robots.txt and register your sitemap URLs.

Adding Sections

use Becklyn\RobotsTxt\Builder\RobotsTxtBuilder;

$builder = new RobotsTxtBuilder();

// adding a section
$builder->getSection("google")
    ->allow("/public")
    ->disallow("/admin")
    ->crawlDelay(20);
    
$builder->getSection("bing")
    ->allow("/public")
    ->disallow("/admin")
    ->disallow("/private")
    ->crawlDelay(15);
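
For orientation, this is roughly the output these two sections produce. The Allow/Disallow format matches the examples further below; the exact casing of the crawl-delay line is an assumption:

// Will produce roughly:
//
//      User-Agent: google
//      Allow: /public
//      Disallow: /admin
//      Crawl-Delay: 20
//
//      User-Agent: bing
//      Allow: /public
//      Disallow: /admin
//      Disallow: /private
//      Crawl-Delay: 15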

If multiple search engines have the same directives, you can add one section for all of them:

$builder->getSection("google", "bing")
    ->allow("/public")
    ->disallow("/admin")
    ->crawlDelay(20);
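
Such a shared section is written as a single block, a sketch assuming the standard robots.txt grouping of one User-Agent line per agent:

// Will produce roughly:
//
//      User-Agent: google
//      User-Agent: bing
//      Allow: /public
//      Disallow: /admin
//      Crawl-Delay: 20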

The builder combines the directives from multiple calls for the same user agent into a single section where possible:

$builder->getSection("google")
    ->allow("/public");
    
// ... some code ...

$builder->getSection("google")
    ->disallow("/admin");
    
// will produce a single entry:
//
//      User-Agent: google
//      Allow: /public
//      Disallow: /admin

Sitemaps

$builder
    ->addSitemap("https://example.org/sitemap.xml.tar.gz")
    ->addSitemap("https://example.org/sitemap.xml");
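
The registered URLs end up as standard Sitemap lines in the output, roughly:

// Will produce roughly:
//
//      Sitemap: https://example.org/sitemap.xml.tar.gz
//      Sitemap: https://example.org/sitemap.xml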

Header

You can also add a header which will be included at the very top:

$builder
    ->setHeader("This is some example text");
    
$builder->getSection("google")
    ->allow("/public");
    
// Will produce:
//
//      # This is some example text
//
//      User-Agent: google
//      Allow: /public

Outputting the robots.txt

$content = $builder->getFormatted();
file_put_contents("robots.txt", $content);
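
Putting it all together, a minimal end-to-end sketch using only the calls shown above:

use Becklyn\RobotsTxt\Builder\RobotsTxtBuilder;

$builder = new RobotsTxtBuilder();

// header comment at the very top of the file
$builder->setHeader("Generated robots.txt");

// one section shared by both user agents
$builder->getSection("google", "bing")
    ->allow("/public")
    ->disallow("/admin")
    ->crawlDelay(20);

$builder->addSitemap("https://example.org/sitemap.xml");

// render and write the result
file_put_contents("robots.txt", $builder->getFormatted());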