Strategies to resolve indexed URL conflicts in Discourse

Hi all,
I’m running a Discourse forum and need help with a major SEO + indexing issue after a rebuild.

What Happened:

  • My original Discourse forum crashed, and I lost around 10,000 topics.

  • I rebuilt the forum from scratch, keeping the same domain and same Google Search Console (GSC) property.

  • Since Discourse uses incremental topic IDs, the newly created topics are now reusing old topic IDs (e.g., /t/783 previously belonged to a deleted topic, and now it’s assigned to a new one).

Current Problems:

  1. GSC shows over 12,000 “Crawled - Not Indexed” URLs.

  2. Old topic URLs like /t/old-topic-title/783 are still indexed or being crawled.

  3. These URLs now point to new content (e.g., /t/new-topic-title/783), which causes title mismatches in search and possibly triggers duplicate/thin content penalties.

  4. Some old topic URLs are still being served (not returning 404 or 410) and now resolve to current topics that share the same ID.

  5. The sitemap includes reused IDs, confusing crawlers further.

What should I do to solve it?

In Postgres you could manually set the topic ID sequence to a high value, e.g. 20000, so that new topics start at that number:

SELECT setval('topics_id_seq', 20000, false);
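
Before bumping anything, it’s worth checking what the rebuilt forum’s highest topic ID currently is, just to confirm 20000 clears it comfortably. Assuming a standard Discourse schema, something like this from a psql session against the Discourse database should do it (on a Docker-based install, ./launcher enter app followed by rails dbconsole should get you a prompt):

SELECT MAX(id) FROM topics;                     -- highest topic ID the rebuilt forum has used so far
SELECT last_value, is_called FROM topics_id_seq; -- where the ID sequence currently stands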

You lost the server itself? :cry:


Yup, the whole DB too.

So I did make a new one, and it already has around 6,000 topics, so should I make new topics start from 20k from now on?

Ideally you would have done that as soon as you created the new site, but it’s too late now.

Better late than never - if that sequence is currently at 6000 and you set it to 20000, the next new topic will have ID 20000 instead of 6000.
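
If you want to confirm the change took effect (again assuming the stock sequence name), you can read the sequence back right after running the setval:

SELECT last_value, is_called FROM topics_id_seq;
-- Expect last_value = 20000 and is_called = false, i.e. the next topic
-- created is assigned ID 20000 exactly, not 20001.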


So it would be like 1-5999 and then 20000 and so on, right? And I hope this won’t cause any issues at later stages?

Or shall I just leave it as is, and with time new topics would overwrite the old ones?

The advice is addressing your complaint that old topic IDs are being “reused”. Bumping that sequence up to 20000 will prevent any topics between 6000 and 19999 from being created.
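
To make that concrete, you can look at both numbers side by side; the gap between them is just unused ID space, not an error (the aliases are only for readability):

SELECT
  (SELECT MAX(id) FROM topics)           AS highest_existing_topic_id,  -- around 6000 today
  (SELECT last_value FROM topics_id_seq) AS sequence_position;          -- 20000 after the bump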


Yes, but let’s say I leave it and, over time, IDs like 6000 and so on get reused for new topic URLs. Would that cause any issue, or could it break something in the future?