Update robots.txt (#2632)

I have found that Bing/Yahoo/DuckDuckGo, Yandex, and Google report crawl errors when using the default robots.txt: their bots will not crawl the path '/' or any sub-paths. I agree that the current robots.txt should work and properly implements the specification, yet in practice it does not.

In my experience, explicitly permitting the path '/' by adding the directive Allow: / resolves the issue.

More details can be found in a blog post about the issue here: https://www.dfoley.ie/blog/starting-with-the-indieweb
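The effect of the patched rules can be sketched with Python's standard urllib.robotparser. The snippet below is illustrative only: it parses a minimal robots.txt containing just the directives visible in this diff (the full Grav file has additional Disallow lines not shown here) and checks which paths a compliant crawler may fetch. Note that Python's parser applies rules in file order (first match wins), whereas Google applies the most specific (longest) match; the root-path check below holds under either interpretation.

```python
from urllib.robotparser import RobotFileParser

# Minimal robots.txt built from the directives shown in this diff,
# including the newly added "Allow: /". Illustrative, not the full file.
robots_txt = """\
User-agent: *
Disallow: /user/
Allow: /user/pages/
Allow: /user/themes/
Allow: /user/images/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The root path is now explicitly allowed.
print(parser.can_fetch("*", "/"))               # True
# Paths under /user/ (other than the allowed sub-paths) stay blocked;
# urllib.robotparser hits the earlier "Disallow: /user/" rule first.
print(parser.can_fetch("*", "/user/accounts"))  # False
```

This is only a parsing check; the crawl errors described above came from live crawler behavior, which the explicit Allow: / directive resolved.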
This commit is contained in:
Daithí Seán Ó Foghlú
2019-08-19 03:22:33 +10:00
committed by Andy Miller
parent 20b9ca56fa
commit ed87faad92


@@ -10,3 +10,4 @@ Disallow: /user/
 Allow: /user/pages/
 Allow: /user/themes/
 Allow: /user/images/
+Allow: /