Adds a Robots.txt file that is configurable from /admin/settings/.

This module supports single site as well as [multisites](https://github.com/symbiote/silverstripe-multisites) setups.

## Requirements

* Silverstripe CMS 4.x

## Installation

…

Then run dev/build.

## Configuration

On the SiteConfig (or Site, if Multisites is installed) there is a setting in the CMS that lets you set the robots mode. The three options are:

* Allow all
* Disallow all
* Custom content

The output of all three states is managed through templates and can be overridden in an app or theme.
### Allow all

When switched to 'allow all' the module uses the template `Innoweb/Robots/RobotsController_allow.ss` with the following default content:

```
<% if $GoogleSitemapURL %>Sitemap: {$GoogleSitemapURL}<% end_if %>
User-agent: *
Disallow: /dev/
Disallow: /admin/
Disallow: /Security/
```
The module checks whether the [Google Sitemaps module](https://github.com/wilr/silverstripe-googlesitemaps) is installed and injects the sitemap URL automatically.
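If the sitemap line should appear in the output, the Google Sitemaps module can be installed alongside this one, for example via Composer (package name as per the repository linked above):

```
composer require wilr/silverstripe-googlesitemaps
```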
It allows access to all pages and disallows access to development and security URLs by default.
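As mentioned above, these templates can be overridden per project. A minimal sketch, assuming a standard Silverstripe 4 layout where app (or active theme) templates take precedence over module templates: copy the template to `app/templates/Innoweb/Robots/RobotsController_allow.ss`, adjust the rules, and flush the template cache (`?flush=1`). The extra `/private/` rule below is purely illustrative:

```
<%-- app/templates/Innoweb/Robots/RobotsController_allow.ss - example override --%>
<% if $GoogleSitemapURL %>Sitemap: {$GoogleSitemapURL}<% end_if %>
User-agent: *
Disallow: /dev/
Disallow: /admin/
Disallow: /Security/
Disallow: /private/
```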
### Disallow all

When switched to 'disallow all' the module uses the template `Innoweb/Robots/RobotsController_disallow.ss` with the following default content:

```
User-agent: *
Disallow: /
```

This disallows all robots from accessing any page on the site.
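You can verify which rules are being served straight from the command line, for example (hostname illustrative, any HTTP client works):

```
curl https://staging.example.com/robots.txt
# expected response body in 'disallow all' mode:
# User-agent: *
# Disallow: /
```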
### Custom content

This setting reveals a text field in the CMS where custom code can be entered.
The template contains the following code and doesn't add anything to the custom code entered:

```
$RobotsContent.RAW
```
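As an example, the following could be entered in the CMS field to point crawlers at a manually maintained sitemap and keep them out of an internal search results page (the sitemap URL and the `/search/` path are purely illustrative):

```
Sitemap: https://www.example.com/custom-sitemap.xml

User-agent: *
Disallow: /search/
Disallow: /dev/
Disallow: /admin/
Disallow: /Security/
```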
A good standard robots.txt configuration for Silverstripe looks as follows. This is used as the default when the module is switched to 'allow all':

```
Sitemap: https://www.example.com/sitemap.xml
User-agent: *
Disallow: /dev/
Disallow: /admin/
Disallow: /Security/
```