but it isn’t clear what the upshot of this discussion was. A noindex directive in robots.txt is no longer effective – Google now ignores it. To keep a private site from appearing in search results entirely, Disallow alone is not sufficient; what’s needed is a noindex meta tag on every affected page. In our case, that means the login page and any externally reachable error pages (or, for that matter, a noindex meta tag on every page would be just fine).
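For context, the tag in question goes in each page’s head; a sketch of what I mean (the HTTP-header variant is included as an alternative, assuming the server allows setting response headers):

```html
<!-- In the <head> of every page that should stay out of search results -->
<meta name="robots" content="noindex">
```

For non-HTML resources, the equivalent would be sending an `X-Robots-Tag: noindex` response header instead.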
How can this be accomplished here? Thanks.