Generic rules in "robots.txt" not picked up by Googlebot

This is correct; it was intentionally implemented this way.

Therefore Googlebot receives an extra HTTP header, X-Robots-Tag: noindex, for pages which really should not be indexed.
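
If you want to verify this from the outside, here is a minimal sketch in Python using the requests library. The profile URL is only an example, and sending a Googlebot User-Agent is an assumption on my part, in case the site only sends the header to crawlers.

```python
import requests

# Example profile URL; substitute a page from the site you want to check.
url = "https://www.example.com/u/jacob"

# Identify as Googlebot in case the header is only sent to crawlers.
headers = {
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}

# HEAD is enough, since we only care about the response headers.
response = requests.head(url, headers=headers, allow_redirects=True)

# Pages that should stay out of the index are expected to answer with
# "X-Robots-Tag: noindex"; None means no such header was sent.
print(response.headers.get("X-Robots-Tag"))
```
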
For your own domains, you can use Google Search Console → Inspect URL.

Then try to submit a user-profile URL for indexing, e.g. https://www.example.com/u/jacob
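
The same check can also be scripted with the Search Console URL Inspection API. This is a sketch, assuming you have a service account key (the file name key.json is hypothetical) with access to the verified property:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

# Hypothetical key file; the service account must have access to the property.
creds = service_account.Credentials.from_service_account_file("key.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "inspectionUrl": "https://www.example.com/u/jacob",  # example page
    "siteUrl": "https://www.example.com/",               # verified property
}

result = service.urlInspection().index().inspect(body=body).execute()

# The verdict summarizes whether Google considers the page indexable.
print(result["inspectionResult"]["indexStatusResult"]["verdict"])
```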
