# Generate custom `robots.txt` for each subsite
This module aims to prevent indexing of subsite-specific folder assets that belong to other subsites. It creates a `robots.txt` file with `Disallow` rules for folders belonging to other subsites (i.e. not folders that are common or that belong to the current subsite).
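For illustration, the examples further down assume an asset layout roughly like the following (the folder names are taken from those examples; the shared folder name is hypothetical). Only the folders belonging to *other* subsites end up in the generated rules:

```
assets/
├── example1/             # belongs to the Example 1 subsite
├── example2/             # belongs to the Example 2 subsite
├── example2-documents/   # also belongs to the Example 2 subsite
└── shared/               # hypothetical common folder, never disallowed
```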
```
composer require rotassator/silverstripe-subsites-robotstxt
```
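After installing, you will typically want a dev/build with a flush so the module's configuration is picked up. A minimal sketch, assuming a SilverStripe 4 project with `sake` exposed in `vendor/bin` (visiting `/dev/build?flush=1` in a browser works as well):

```
# rebuild the database/config and flush cached manifests
vendor/bin/sake dev/build "flush=1"
```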
Set the site to live mode to see the subsite-specific `robots.txt`. On `dev` or `test` environments, robots are disallowed for all files. See the Environment management documentation for more details.
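For example, in SilverStripe 4 the environment type is usually set via `SS_ENVIRONMENT_TYPE` in the project's `.env` file (values: `dev`, `test`, `live`); a minimal sketch:

```
# .env — serve the real subsite-specific rules instead of "Disallow: /"
SS_ENVIRONMENT_TYPE="live"
```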
For the `example1.com` subsite:

```
# robots.txt for Example 1
User-agent: *
Disallow: assets/example2/
Disallow: assets/example2-documents/
```
For the `example2.com` subsite:

```
# robots.txt for Example 2
User-agent: *
Disallow: assets/example1/
```
On `dev` or `test` environments, every subsite serves a fully disallowing `robots.txt`, for example:

```
# robots.txt for Example 1
User-agent: *
Disallow: /
```