Hi all
I submitted my site to Google for indexing and I’m getting the error “No: ‘noindex’ detected in ‘X-Robots-Tag’”.
How do I remove this header?
Hi @Younes_Dev, can you check the value of your site’s `allow index in robots txt` setting?
That setting is enabled by default. If it is enabled on your site and you are still getting the error, what version of Discourse is your site on? There was an issue related to this on some Discourse versions prior to 2.9, but it has been fixed since then: Search engines now blocked from indexing non-canonical pages.
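If the setting looks right, it can help to confirm what the server is actually sending. A minimal sketch using only the standard library — the URL is a placeholder, and a CDN or reverse proxy in front of Discourse could also be adding the header:

```python
import urllib.request

def x_robots_tag(url):
    """Return the X-Robots-Tag response header for `url`, or None if absent."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.headers.get("X-Robots-Tag")

# Example (replace with your own forum URL):
# x_robots_tag("https://example.com/")  # e.g. "noindex" if the header is set
```

If this returns `"noindex"`, the header is being set server-side, independently of anything in robots.txt.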
Hi everyone,
I’m having trouble getting Google to index a specific page on my Discourse forum (https://nhasg.com.vn/g/Sitetor). Google Search Console reports a “noindex” error, indicating that the page is being blocked from indexing.
Here’s what I’ve tried so far:
- Modified robots.txt: my initial robots.txt file had `Disallow: /g`, which blocked all URLs under `/g`. I changed this to `Allow: /g/Sitetor` and `Disallow: /g/*` to allow indexing of the specific page while blocking the others in that directory.
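For reference, assuming a standard `User-agent` grouping, the change described above would look roughly like this in robots.txt:

```text
User-agent: *
Allow: /g/Sitetor
Disallow: /g/*
```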
However, even after these changes and submitting the URL in Google Search Console, the page is still not being indexed. I’ve re-checked the robots.txt with Google’s robots.txt tester, and it seems to be configured correctly.
I’m stumped! Could there be another reason why Google is seeing a “noindex” directive? Any help would be greatly appreciated.
Thanks in advance!
I think that depends on the crawler: some treat Disallow as taking precedence, but for Googlebot the most specific (longest) matching rule wins, so `Allow: /g/Sitetor` should override `Disallow: /g/*`. Note also that robots.txt only controls crawling — it can’t remove the `noindex` coming from the X-Robots-Tag HTTP response header.
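For what it’s worth, Google documents robots.txt precedence as: the longest (most specific) matching rule wins, and ties go to the least restrictive (Allow) rule. A minimal sketch of that logic using the two rules from this thread — other crawlers may behave differently, and this is an illustration of the precedence rule, not a full robots.txt parser:

```python
import re

# The two rules from the modified robots.txt discussed above.
RULES = [("allow", "/g/Sitetor"), ("disallow", "/g/*")]

def match_len(pattern, path):
    """Return the pattern's length if it matches the path, else -1.
    In robots.txt patterns, '*' matches any run of characters and '$' anchors the end."""
    regex = "^" + re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return len(pattern) if re.match(regex, path) else -1

def is_allowed(path):
    """Longest matching rule wins; on a tie, 'allow' wins (Google's documented behavior)."""
    best_len, best_kind = -1, "allow"  # no matching rule means crawling is allowed
    for kind, pattern in RULES:
        m = match_len(pattern, path)
        if m > best_len or (m == best_len and kind == "allow"):
            best_len, best_kind = m, kind
    return best_kind == "allow"

print(is_allowed("/g/Sitetor"))  # True: Allow (length 10) beats Disallow (length 4)
print(is_allowed("/g/other"))    # False: only Disallow /g/* matches
```

Under these rules the specific page comes out crawlable — which is another hint that the “noindex” is coming from the response header rather than from robots.txt.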