but it isn’t clear what the upshot of this discussion was. A noindex directive in robots.txt is no longer effective – Google now ignores it. To keep a private site from appearing in search results entirely, Disallow is not sufficient. What’s needed is the noindex meta tag on every affected page. In our case, that means the login page and any externally reachable error pages (or, for that matter, a noindex meta tag on every page would be just fine).
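For context, here’s a minimal sketch of the robots.txt situation being described. The Disallow directive is standard; the commented-out Noindex line illustrates the directive Google no longer honors:

```
# robots.txt — Disallow blocks crawling, but a disallowed page can
# still appear in search results if other sites link to it.
User-agent: *
Disallow: /

# A "Noindex:" line here is no longer supported by Google:
# Noindex: /
```

Note the irony: if a page is disallowed here, crawlers may never fetch it at all, so they would never see a noindex meta tag on that page either.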
Yes, Google stopped supporting noindex in robots.txt. But their recommendation now appears to be putting a noindex meta tag on all pages that you want to be entirely eliminated from their index. That’s what I’m trying to accomplish.
I haven’t done much theme work, so this is exactly what I needed to know. Thanks! And it’s showing up in the head correctly (added to “head_tag”). For the record, the Google-recommended tag, if you want to ask all robots not to index a page, is:
<meta name="robots" content="noindex">
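In a theme’s head_tag section, that might look something like this (a minimal sketch; the surrounding markup is illustrative, not taken from any particular theme):

```html
<!-- Placed inside <head>; asks compliant crawlers not to index
     this page or follow its links into the index. -->
<head>
  <meta name="robots" content="noindex">
</head>
```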
I’ve now disabled (well, actually commented out) robots.txt completely for the reasons noted earlier in this thread. Thanks again.
Glad I could help! I’m still not very good at themes and know nothing about SEO, so I didn’t know what the meta tag actually needed to look like. Glad I got you close enough to solve it.