If you search Google for the total number of category pages indexed for meta.discourse.org, using the following query:
site:meta.discourse.org/c/
you will find about 1,670 pages indexed. Similarly, I have a Discourse site with 50 categories,
yet more than 23 thousand pages are indexed under categories. (This is a very dangerous signal for SEO because of content duplication.)
This should be fixed, otherwise Google may penalise a Discourse website's rankings.
Look at the following screenshot to see how duplicate pages are created and indexed.
I think it is better to look directly at the search index. I don’t know whether Google Search offers such a mechanism, but Yandex provides a detailed report: duplicates, poor content, all error types, excluded and included pages, crawl statistics, etc. According to those results, there are a lot of duplicate pages.
We used to include a canonical meta tag on the /c/plugin page, but (I think) this regressed 4 months ago when we introduced the “Default Topic List” category setting.
This should now be fixed, and a test case has been added to prevent future regressions.
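For anyone wanting to verify the fix on their own site, a minimal sketch of checking a page’s HTML for the canonical tag — the sample HTML and URL here are illustrative, not taken from an actual Discourse response:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonicals.append(a.get("href"))

# Hypothetical snippet of a category page's <head>; in practice you would
# fetch the page HTML for your own /c/... URL and feed it in instead.
html = """<head>
<link rel="canonical" href="https://meta.discourse.org/c/plugin" />
</head>"""

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonicals)
```

A paginated duplicate such as /c/plugin?page=2 should point its canonical at the same base category URL; if `finder.canonicals` is empty or points at the paginated URL itself, the regression is still present on that page.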