Adds a Robots.txt file that is configurable from /admin/settings/.



Version 3.1.0, released 2020-12-28.



Adds a Robots.txt file that is configurable from /admin/settings/ and injects robots meta tag into all pages.

This module supports single site as well as multisites setups.


Requirements

  • Silverstripe CMS 4.x


Installation

Install the module using composer:

composer require innoweb/silverstripe-robots dev-master

Then run dev/build.



On the SiteConfig (or on the Site, if the Multisites module is installed) there is a setting in the CMS that lets you set the robots mode. The three options are:

  • Allow all
  • Disallow all
  • Custom content

The output of all three states is managed through templates and can be overwritten for an app or theme.

Allow all

When switched to 'allow all' the module uses the template Innoweb/Robots/ with the following default content:

<% if $GoogleSitemapURL %>Sitemap: {$GoogleSitemapURL}<% end_if %>
User-agent: *
Disallow: /dev/
Disallow: /admin/
Disallow: /Security/

The module checks whether the Google Sitemaps module is installed and injects the sitemap URL automatically.

It allows access to all pages and disallows access to development and security URLs by default.
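With the Google Sitemaps module installed, the rendered /robots.txt would therefore look something like the following (the sitemap URL is a placeholder for your site's actual URL):

```
Sitemap: https://www.example.com/sitemap.xml
User-agent: *
Disallow: /dev/
Disallow: /admin/
Disallow: /Security/
```

Without the Google Sitemaps module, the Sitemap line is simply omitted.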

Disallow all

When switched to 'disallow all' the module uses the template Innoweb/Robots/ with the following default content:

User-agent: *
Disallow: /

This disallows all robots from accessing any page on the site.

Custom content

This setting reveals a text field in the CMS where custom code can be entered.

The template outputs the custom code exactly as entered and doesn't add anything to it.


A good standard robots.txt configuration for Silverstripe looks as follows; this is what the module uses as the default when switched to 'allow all':

User-agent: *
Disallow: /dev/
Disallow: /admin/
Disallow: /Security/

Robots meta tag

The module injects a robots meta tag into every page. The injection of the meta tag can be disabled using the following config, e.g. if the robots meta tag is managed manually in the template:

  robots_enable_metatag: false
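For example, to disable the injection site-wide, the option can be set in a project YAML config file. The file name and the use of the Page class as the config holder below are assumptions for illustration; check which class the module reads this setting from:

```yml
# app/_config/robots.yml (hypothetical file name)
Page:
  robots_enable_metatag: false
```

Remember to flush the config cache (e.g. via dev/build) after changing YAML configuration.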

By default, all pages are set to index, follow with the following exceptions:

  • The Robots.txt setting for the site is set to 'Disallow all'
  • The environment is set to test or dev
  • The current page is displayed by the Security controller
  • The Priority setting for the page is -1 (see Google Sitemaps module)
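In practice this means a normal published page carries a tag like the first line below, while a page matching any of the exceptions carries the second (a sketch of typical output; the exact markup may differ):

```html
<meta name="robots" content="index, follow">
<meta name="robots" content="noindex, nofollow">
```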

Additionally, for each page type a config value can be set to control the meta tag. By default, the following values are set:

  robots_noindex: false
  robots_nofollow: false

  robots_noindex: true
  robots_nofollow: true

  robots_noindex: true
  robots_nofollow: true

This can be customised for any custom page types as needed.
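For instance, to keep a custom page type out of search indexes entirely, both flags could be set in a project YAML config. The class name below is an illustration only, not part of the module:

```yml
# app/_config/robots.yml (hypothetical file and class name)
App\Pages\InternalPage:
  robots_noindex: true
  robots_nofollow: true
```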


License

BSD 3-Clause License, see License.