Robots.txt to completely block all indexing of private site

Hi. I’ve seen the discussion at:

but I did not find a clear-cut answer to this question there. How can I configure robots.txt for a Discourse site so that it is not indexed by any crawlers? This is a private site and there's no reason for it to have any presence in search engines. Thanks!

See the second-to-last reply in that topic: Needing to edit robots.txt file - where is it?

Ah, missed that one down there. Perfect. Worked like a charm. Thanks!

Also, make sure you have the login required setting turned on by rerunning the wizard at discourse.yoursite.com/wizard. That will also shut out search engines.

The site has always required login and does not permit user self-registration. However, it has still been showing up in search engines – just with no content indexed, since crawlers couldn't log in. My hope is that with robots.txt blocking them explicitly, any mention will cease. Thanks.
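For anyone landing here later: the standard robots.txt directive that asks all well-behaved crawlers to skip the entire site looks like this (on Discourse the file is generated for you, so this is normally controlled via a site setting rather than by editing the file by hand):

```txt
# Applies to every crawler that honors the Robots Exclusion Protocol
User-agent: *
# Disallow crawling of every path on the site
Disallow: /
```

One caveat worth knowing: robots.txt stops compliant crawlers from fetching pages, but a search engine may still list a bare URL it discovered through external links. Fully removing an existing listing can additionally require the engine's own URL removal tools.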