Robots.txt to completely block all indexing of private site

Hi. I’ve seen the discussion at:

but I did not find a clear-cut answer to this question: how can I set up a robots.txt for a Discourse site so that it is not indexed by any crawlers? This is a private site, and there is no reason for it to have any presence in search engines. Thanks!


See the second-to-last reply in that topic: Needing to edit robots.txt file - where is it?
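For anyone landing here later, the end result is a robots.txt that disallows everything for every crawler. A minimal sketch of what the served file should contain (how you get Discourse to serve it is covered in the linked topic):

    User-agent: *
    Disallow: /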


Ah, missed that one down there. Perfect. Worked like a charm. Thanks!


Also, make sure you have the login required setting turned on by rerunning the wizard at discourse.yoursite.com/wizard. That will also shut out search engines.
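As a quick sanity check (assuming your site is at discourse.yoursite.com, as in the example above), you can confirm both measures from the command line: robots.txt should show a blanket disallow, and anonymous requests to a topic list such as /latest should be sent to the login page instead of returning content:

    # Should show a blanket Disallow for all user agents
    curl -s https://discourse.yoursite.com/robots.txt

    # With login required, expect a redirect (3xx) or a login page
    # rather than a 200 response with topic content
    curl -s -o /dev/null -w "%{http_code}\n" https://discourse.yoursite.com/latest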


The site has always required login and does not permit user self-registration. However, it has still been showing up in search engines, just with no content indexed, since the crawlers could not log in. My hope is that with an explicit robots.txt block in place, any mention of the site will disappear. Thanks.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.