1 million topics - takes forever to get indexed without a sitemap in robots.txt

I recently merged multiple bbPress forums into a single Discourse forum, and Google is indexing it very slowly. The crawler can't get very deep.

For the past month only about 200 pages per day have been getting indexed, which is a tiny fraction of the forum's content. It's hard for Google to index a big forum.
The following three things need to be solved so that Discourse can be used for large, multi-purpose sites.

  1. A noindex, follow directive in the meta robots header of tag pages. A forum with many tags will create a lot of duplicate content in the index unless this is solved.

  2. There must be a way to edit robots.txt to include a sitemap, or to keep unnecessary pages from getting indexed (see the sketch after this list).

  3. There must be a sitemap facility, just like NodeBB has.
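
To make (1) and (2) concrete, this is roughly what I mean; the sitemap URL and the tag path below are only examples, not anything Discourse generates today:

```text
# robots.txt (example only): point crawlers at a sitemap and keep
# the duplicate-content tag listings out of the index
User-agent: *
Disallow: /tags/
Sitemap: https://forum.example.com/sitemap.xml
```

```html
<!-- meta robots tag for tag pages: follow the links, but do not index the page -->
<meta name="robots" content="noindex, follow">
```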

In NodeBB all three of these things are possible, but I went with Discourse because it is more stable with big databases.

No, none of this is true. Site maps are useless and unneeded. Turn off JavaScript and/or set your user agent to Google. See for yourself.
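
For example, here is a minimal sketch (the forum URL is a placeholder) that fetches a topic with a Googlebot user agent and prints the plain HTML Discourse serves to crawlers:

```python
# Fetch a Discourse topic the way a search crawler would.
# The URL is a placeholder; substitute any topic from your own forum.
import requests

url = "https://forum.example.com/t/some-topic/12345"
headers = {"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"}

response = requests.get(url, headers=headers, timeout=10)
print(response.status_code)
print(response.text[:2000])  # crawler-facing HTML, no JavaScript required
```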

Every customer we have gets indexed fine, even migrations with hundreds of thousands of topics.

(The tags page issue is valid, but already fixed in 1.8 beta by @neil)


Have to agree with @codinghorror: every Discourse site I have seen ranks incredibly well in Google Search. For just about every technical topic I search for these days (and I search for a lot), one of the top 3 results is almost always a site running on Discourse.


Recommend you try out the updates by @vinothkannans to the sitemap plugin and report back on the plugin thread as to how effective they were.
