Google notification to remove "noindex" statements from robots.txt


I just got this email from google

Remove “noindex” statements from the robots.txt of

Is this something known to the community? Do I need to take any action?

Thank you


Looks like Google just sent this out, I got an email too.


Same here, posting to keep an eye on opinions etc


I got the same email. Do I have to deal with this myself? If so, how?


Don’t worry, this is coded in core. The Discourse team will update it, and in a few days it will be fixed; we’ll have nothing to do but upgrade our Discourse instances.

But for the most impatient, you can edit your robots.txt now:


Google Search Console is sending me messages about our Discourse site:

Remove ‘noindex’ statements from the robots.txt of

To owner of,

Google has identified that your site’s robots.txt file contains the unsupported rule ‘noindex’.

This rule was never officially supported by Google and on 1 September 2019 it will stop working. Please see our help centre to find out how to block pages from the Google index.


If you haven’t customized your robots.txt file, you won’t need to do anything; disallow is already doing most of the work.

By default Discourse uses both disallow and noindex in robots.txt.
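To illustrate, the default rules look something like this (a sketch, not the full file; the exact paths vary by site and Discourse version):

```text
User-agent: *
Disallow: /admin/
Noindex: /admin/
```

The `Noindex:` directive on the last line is the unsupported rule Google’s email refers to; the `Disallow:` line above it is the officially supported one.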

In the blog post about this update Google suggests using disallow, which we already do. We use noindex in addition to help avoid this linking issue Google mentions (I added emphasis to the relevant bit)…

Disallow in robots.txt: Search engines can only index pages that they know about, so blocking the page from being crawled usually means its content won’t be indexed. While the search engine may also index a URL based on links from other pages, without seeing the content itself, we aim to make such pages less visible in the future.

On our end we’ll look at making an update to add the noindex meta tag or use the X-Robots-Tag header in our HTTP responses to make sure Google’s not indexing the link when it appears on other pages (we’ll update this topic with any changes).
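For reference, the two mechanisms mentioned above look roughly like this (illustrative only; how Discourse ends up implementing this may differ):

```text
# Option 1: a robots meta tag in the page's <head>
<meta name="robots" content="noindex">

# Option 2: an HTTP response header
X-Robots-Tag: noindex
```

Unlike a robots.txt `Disallow`, both of these require Google to crawl the page to see the directive, which is exactly what lets them suppress indexing of URLs Google discovers via links from other pages.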

If you’ve added custom noindex rules to robots.txt via your /admin/customize/robots admin page, you should change them to disallow.
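For example, a custom rule like the first line below (the path is hypothetical) would become the second:

```text
Noindex: /my-hidden-category/     # unsupported by Google after 1 September 2019
Disallow: /my-hidden-category/    # supported equivalent
```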


The presence of noindex in robots.txt was an ill-advised “SEO” change that we were unfortunately convinced to make about a year ago. That change has now been reverted, and the change backported to stable.