The more /groups and similar pages (enhanced by themes) move from admin or purely functional views toward meaningful landing pages, the more sense it makes to include them in robots.txt and optimize them for search.
What benefit does crawling /groups offer?
They don’t hold any unique information; all of the topics in group activity are already going to be crawled.
With crawlers dedicating finite resources to each site, doesn’t crawling pages which only link to pages linked elsewhere reduce the effectiveness of each crawl?
If the main value of your community is your content, I’d say that’s totally correct. But if the value also comes from the groups, categories, etc. that you offer, I would like these pages to have unique information so they become valuable to the crawler.
I think it is reasonable to have https://meta.discourse.org/g in robots.txt, maybe even https://meta.discourse.org/g/encrypt-users. @codinghorror, what are your thoughts here?
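For illustration, a minimal sketch of what a robots.txt allowing those paths could look like, assuming the group pages are currently disallowed (the actual Discourse-generated robots.txt may differ):

```
# Hypothetical robots.txt fragment: explicitly allow crawling of group pages
User-agent: *
Allow: /g
Allow: /g/encrypt-users
# Other disallowed paths would remain unchanged, e.g.:
Disallow: /admin/
```

Note that per the robots.txt convention, anything not matched by a Disallow rule is crawlable by default, so simply removing an existing `Disallow: /g` line would have the same effect as the explicit `Allow` directives above.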