# Silverstripe Robots.txt generation
This module provides simple robots.txt generation for Silverstripe, with various configuration options available.

When a site is not in live mode (such as on a testing domain), it will block the entire site, ensuring that (at least respectful) search engines refrain from indexing your test site.
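A site-wide block is expressed in robots.txt with a wildcard user agent and a root disallow rule, so on a non-live site you can expect output roughly like the following (a sketch of the standard directives, not necessarily the module's exact output):

```
User-agent: *
Disallow: /
```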
## Installation
Composer is the recommended way of installing SilverStripe modules.
```sh
composer require gorriecoe/silverstripe-robots
```
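Once installed (and your site rebuilt/flushed), the generated file should be available over HTTP. Assuming the module registers the usual `/robots.txt` route, you can inspect the output directly:

```sh
# Check the generated robots.txt (example.com is a placeholder for your domain)
curl https://example.com/robots.txt
```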
## Requirements
## Maintainers
## Credit
This module is heavily inspired by silverstripe-robots by Damian Mooyman.
## Configuration
You can add a page or pattern to be blocked by adding it to the `disallowed_urls` configuration:
```yaml
gorriecoe\Robots\Robots:
  disallowed_urls:
    - 'mysecretpage.html'
    - '_private'
    - 'Documents-and-Settings/Ricky/My-Documents/faxes/sent-faxes'
```
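With the configuration above, each entry should surface as a `Disallow` rule in the generated file, along these lines (a sketch; the exact path prefixing is handled by the module):

```
User-agent: *
Disallow: /mysecretpage.html
Disallow: /_private
Disallow: /Documents-and-Settings/Ricky/My-Documents/faxes/sent-faxes
```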
By default, any page with `ShowInSearch` set to false will also be excluded. This can be useful for hiding auxiliary pages such as "thanks for signing up" or error pages.
You can turn this off (if you really, absolutely think you need to) using the configuration below.
```yaml
gorriecoe\Robots\Robots:
  disallow_unsearchable: false
```
By default the module will check for a sitemap file at `/sitemap.xml`. You can set a custom file location using the configuration below.
```yaml
gorriecoe\Robots\Robots:
  sitemap: '/sitemap.xml'
```
For multiple sitemaps:
```yaml
gorriecoe\Robots\Robots:
  sitemap:
    - '/sitemap_index.xml'
    - 'http://www.gstatic.com/s2/sitemap.xml'
```
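Each configured entry should then appear as a `Sitemap` directive in the generated file, roughly like this (a sketch; relative paths are presumably resolved against your site's base URL, with `example.com` as a placeholder):

```
Sitemap: https://example.com/sitemap_index.xml
Sitemap: http://www.gstatic.com/s2/sitemap.xml
```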