Why are tags disallowed in robots.txt?

I realized that today: in the robots.txt file, tag URLs are disallowed.

Disallow: /tags
Disallow: /tags/
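
In a robots.txt file, Disallow rules only apply within a user-agent group, so these lines sit under a block roughly like this (a sketch; the exact rules Discourse generates vary by version):

User-agent: *
Disallow: /tags
Disallow: /tags/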

How can I change that?

Search engines don’t like duplicated content. A tag carries no independent information of its own; it only serves to group topics, and the category sections are enough for a search engine. There have already been many examples of sites being penalized over these tag pages.

Why? Every topic in Discourse that has a Tag also belongs to a Category, and Category topics are not blocked in robots.txt.

Not blocking the Tag routes to Topics would create duplicate content, because there would be more than one path to the exact same content.
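
To make that concrete, here is a sketch of the competing paths (the host, slugs, and IDs are hypothetical):

https://forum.example.com/c/support/5 (the category listing)
https://forum.example.com/tags/install (a tag listing surfacing the same topics)
https://forum.example.com/t/how-to-install/42 (the topic itself, reachable from both)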

True, canonical URLs pointing to the category path would mean that search engines ignore the tag path. But why give them something that they won’t use?
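
For reference, this is what the canonical declaration looks like in a topic page's head (the host and topic are hypothetical):

<link rel="canonical" href="https://forum.example.com/t/how-to-install/42" />

Whichever listing path a crawler arrives from, this tag tells it which URL counts as the real one.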

My category system is quite minimal, and I don’t use categories and tags with the same name. I don’t think that counts as duplicate content when used that way.

I had too many subcategories, so I deleted them and turned those categories into tags. Using too many categories creates confusion, so I chose the tag system instead.

Are you under the impression that it’s harming your SEO?

It is fair to note that using few or no categories and relying on tags is a minor edge case. But everything is indexed at the topic level regardless. You could use the sitemap plugin (it is now official) if you are worried.
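
Once it is enabled, the sitemap is typically advertised to crawlers with a single robots.txt directive, something like this (hypothetical host; the exact path depends on your setup):

Sitemap: https://forum.example.com/sitemap.xml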

All of the topics are already presented to the search engines under the categories. If you also present them under tags, you’ll be duplicating the content.

You won’t duplicate content: a topic reached by category or by tag has the same canonical URL. But you will present messy search results, with some topics indexed and listed by tag and others by category.
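
One way to see that mixing for yourself is a search engine's site: operator (hypothetical host):

site:forum.example.com/tags
site:forum.example.com/c

If both queries return listing pages, topics are being surfaced through both paths.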

It is possible to override robots.txt in this regard now, if you really want to: /admin/customize/robots.
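
If you do override it there, a minimal sketch of the relevant block might look like this (assuming you want the tag pages crawlable; keep whatever other rules your site already has):

User-agent: *
Allow: /tags
Allow: /tags/

Simply deleting the two Disallow lines from the override works too, since anything not explicitly disallowed is crawlable by default.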
