Generate custom robots.txt for subsites

1.0 2017-05-17 05:45 UTC



Generate custom robots.txt for each subsite.

This module prevents search engines from indexing asset folders that belong to other subsites. It generates a robots.txt file containing Disallow rules for the folders owned by other subsites (i.e. excluding folders that are shared or belong to the current subsite).
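The rule-generation logic can be sketched as follows. This is a minimal Python illustration of the behaviour described above, not the module's actual PHP implementation; the function and data-structure names are hypothetical.

```python
# Hypothetical sketch: build robots.txt Disallow rules for asset
# folders that belong to *other* subsites, leaving the current
# subsite's folders (and shared folders) crawlable.
def build_robots_txt(current_subsite: str, subsite_folders: dict[str, list[str]]) -> str:
    lines = [f"# robots.txt for {current_subsite}", "", "User-agent: *"]
    for subsite, folders in subsite_folders.items():
        if subsite == current_subsite:
            continue  # the current subsite's own folders are not disallowed
        for folder in folders:
            lines.append(f"Disallow: assets/{folder}/")
    return "\n".join(lines) + "\n"

# Folder layout assumed for illustration only.
folders = {
    "Example 1": ["example1"],
    "Example 2": ["example2", "example2-documents"],
}
print(build_robots_txt("Example 1", folders))
```

With this layout, the output for "Example 1" disallows only the "Example 2" folders, matching the example output shown later in this document.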


Installation

composer require rotassator/silverstripe-subsites-robotstxt

Live mode

Set the site to live mode to serve the subsite-specific robots.txt. On dev or test environments, all robots are disallowed from the entire site.

See Environment management documentation for more details.

Example robots.txt for live site

For subsite Example 1:

# robots.txt for Example 1

User-agent: *
Disallow: assets/example2/
Disallow: assets/example2-documents/

For subsite Example 2:

# robots.txt for Example 2

User-agent: *
Disallow: assets/example1/

Example robots.txt for non-live site

# robots.txt for Example 1

User-agent: *
Disallow: /