On my forum there are categories which require a certain TL level to read.
Google tries to crawl this and gets an error. Should these be automatically excluded by robots.txt?
Where did Google get the links from? Are these topics showing up in a sitemap?
Hmm, great question. I see that the canonical URL for topics doesn't contain the category_id, so these can't easily be filtered by pattern. Assuming the topics aren't in the sitemap, then if Google finds the links elsewhere there's no easy block: you'd have to list every individual URL in robots.txt, which is not a sensible approach.
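To illustrate why (a hypothetical sketch, assuming the forum is served from the site root): a pattern-based rule only helps when the restricted part is visible in the URL path, and topic canonical URLs here are of the form /t/<slug>/<topic_id> with no category segment to match.

```text
User-agent: *
# Category listing pages do carry the category in the path, so these can be blocked:
Disallow: /c/restricted-category/
# ...but topic pages have no category in the URL, so each one would need its own rule
# (hypothetical topic slugs/ids for illustration):
Disallow: /t/some-restricted-topic/1234
Disallow: /t/another-restricted-topic/5678
# One line per restricted topic — exactly the unmaintainable approach described above.
```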
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.