but I did not find a clear-cut answer to this question. How can I enable a robots.txt for a Discourse site to indicate that it should not be indexed by any crawlers? This is a private site and there's no reason for it to have any presence in search engines. Thanks!
Also, make sure you have the login required setting turned on by rerunning the wizard at discourse.yoursite.com/wizard. That will also shut out search engines.
The site has always required login, and it does not permit user self-registration. However, it has still been showing up in search engines, just with no content indexed, since crawlers could not log in. My hope is that explicit robots.txt blocking will make any mention cease. Thanks.
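For reference, the blanket-deny rule I'm after is the standard one from the Robots Exclusion Protocol (how your Discourse version generates or overrides robots.txt may vary, so treat this as the target output rather than a file to drop in by hand):

```text
# Ask all compliant crawlers to skip the entire site
User-agent: *
Disallow: /
```

One caveat I've since learned: robots.txt is advisory. Well-behaved crawlers honor it, but on its own it does not remove a site from results, and some engines may still list the bare URL if other sites link to it.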