Enabling related topics

:bookmark: This guide explains how to enable and configure the Related Topics feature, which is part of the Embeddings module in the Discourse AI plugin.

:person_raising_hand: Required user level: Administrator

The Related Topics feature helps users discover relevant content by suggesting semantically similar topics based on the one they’re currently reading. This enhances content exploration and increases user engagement.

Summary

  • Enable the Related Topics feature in your Discourse site settings
  • Configure the feature based on your hosting setup (Discourse-hosted or self-hosted)
  • Understand how the feature works and its benefits for your community

Prerequisites

For Discourse-hosted customers

If your site is hosted by Discourse, the Embeddings feature is provided for you using an open-source model. No additional setup is required.

For self-hosted instances

Self-hosted Discourse instances need to supply their own embeddings, either through a third-party API key or a self-hosted model. You can choose from the following options:

  • OpenAI
  • Azure OpenAI
  • Self-hosted Embeddings
  • Cloudflare Workers AI
  • Google Gemini

For detailed instructions on setting up these services, refer to their respective configuration guides.

Enabling and configuring related topics

  1. Go to Admin → Settings → Plugins
  2. Search for “discourse-ai” and ensure it’s enabled
  3. Enable ai_embeddings_enabled to activate the Embeddings module
  4. Enable ai_embeddings_semantic_related_topics_enabled to activate the Related Topics feature
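
On self-hosted installs you can also toggle the same settings from a Rails console instead of the admin UI. This is a minimal sketch, assuming a standard Docker-based install; the setting names are the ones listed in the steps above:

    # From the server: cd /var/discourse, ./launcher enter app, then rails c
    # The setting names below are the ones shown in the steps above.
    SiteSetting.ai_embeddings_enabled = true
    SiteSetting.ai_embeddings_semantic_related_topics_enabled = true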

Additional configuration

Depending on your setup, you may need to adjust the following settings:

  • For non-English sites (Discourse-hosted or self-hosted with own model):
    Set ai embeddings model to multilingual-e5-large
  • For Cloudflare Workers AI:
    Set ai embeddings model to bge-large-en
  • For OpenAI or Azure OpenAI:
    Set ai embeddings model to text-embedding-ada-002
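
The model setting can be changed the same way from a console. A sketch, assuming the “ai embeddings model” label maps to the internal name ai_embeddings_model:

    # Pick the value that matches your provider, per the list above.
    SiteSetting.ai_embeddings_model = "text-embedding-ada-002"    # OpenAI / Azure OpenAI
    # SiteSetting.ai_embeddings_model = "multilingual-e5-large"   # non-English sites
    # SiteSetting.ai_embeddings_model = "bge-large-en"            # Cloudflare Workers AI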

How related topics work

When a user visits a topic, Discourse queries the database for the most semantically similar topics based on their embedded representations. These related topics are then presented to the user, encouraging further exploration of the community’s content.
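
Conceptually, this is a nearest-neighbour lookup over the stored embedding vectors. The sketch below is illustrative only; the table and column names are hypothetical rather than the plugin’s actual schema, and it just shows the kind of pgvector distance query such a lookup relies on:

    # Hypothetical schema: topic_embeddings(topic_id, embedding vector(...)).
    # "<=>" is pgvector's cosine-distance operator; smaller means more similar.
    current_topic_id = 123

    sql = <<~SQL
      SELECT topic_id
      FROM topic_embeddings
      WHERE topic_id <> #{current_topic_id}
      ORDER BY embedding <=> (
        SELECT embedding FROM topic_embeddings WHERE topic_id = #{current_topic_id}
      )
      LIMIT 5
    SQL

    related_topic_ids = ActiveRecord::Base.connection.select_values(sql)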

Related topics example

An animated GIF showing what related topics look like at the bottom of a topic.

Features

  • Semantic textual similarity: Goes beyond keyword matching to find truly related content
  • Toggle between “Suggested” and “Related” topics
  • Available for both anonymous and logged-in users

FAQs

Related topics architecture

At a high level, this is what happens when a topic is created or updated:

sequenceDiagram
    User->>Discourse: Creates topic
    Discourse-->>Embedding Microservice: Generates embeddings
    Embedding Microservice-->>Discourse: Returns embeddings
    Discourse-->>PostgreSQL: Stores embeddings

And during topic visit:

sequenceDiagram
    User->>Discourse: Visits topic
    Discourse-->>PostgreSQL: Queries closest topics
    PostgreSQL-->>Discourse: Returns closest topics
    Discourse->>User: Presents related topics

Q: How is topic/post data processed?
A: For Discourse-hosted sites, data is processed within our secure virtual private datacenter. For self-hosted sites, data processing depends on your chosen third-party provider.

Q: Where is the embeddings data stored?
A: Embeddings data is stored in your Discourse database, alongside other forum data like topics, posts, and users.

Q: What semantic model is used, and how was it trained?
A: Discourse-hosted sites use the all-mpnet-base-v2 model by default. This model performs well for both niche and general communities. Self-hosted sites may use different models depending on their chosen provider.

Additional resources

Last edited by @hugh 2024-08-06T04:30:54Z

Last checked by @hugh 2024-08-06T04:30:59Z

10 Likes

Something worth keeping an eye on.

In reviewing many posts in Related Topics on an English site (OpenAI), I’m starting to notice that topics in Spanish tend to be grouped together, and I suspect that if they were first translated to English each post would have a different vector and thus be clustered with other posts. :slightly_smiling_face:



A side benefit of this feature for moderators is to check that the categories of the topics listed in Related Topics are correct.

As I review each new post I also check the Related Topics. This is becoming an effective way to identify topics created with the wrong category.

FYI - A related idea was noted in this feature request.



I often need the following link when looking for this topic, and it’s not easy to find, so I’m noting it here.

1 Like

That behavior is governed by the model, and it appears to be a known problem:

I think the OSS model we recommend for multilingual sites does a better job at this, but we still need to roll it out to more customers to validate that.

2 Likes

It won’t let me enable this option:

Am I missing something here or is Gemini alone not enough?

UPDATE: The instructions and error description may want to be updated to note that the ai embeddings model setting should also be changed to match the provider; otherwise ai_embeddings_enabled can’t be enabled. The parameter description is also missing Gemini as an option.

1 Like

7 posts were split to a new topic: “Net::HTTPBadResponse” errors on Gemini Embeddings

What do I fill in here, please:

I want to fill in the above because I want to enable the first of the four options shown below:

If you use OpenAI, nothing.

1 Like

Then this first option (the Embeddings module) gives me trouble; it won’t let me enable it:

Most of those can stay empty. But ai embeddings discourse service api key is your OpenAI API key, and ai embeddings discourse service api endpoint is https://api.openai.com/v1/embeddings. The model should be text-embedding-3-large (sure, it can be the small one too, but it has some issues).
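
If it helps, here is the same OpenAI setup as a console sketch; I’m assuming the spaced labels above map to underscored internal setting names:

    # Assumed internal names for the "ai embeddings discourse service ..." settings above.
    SiteSetting.ai_embeddings_discourse_service_api_key = "YOUR_OPENAI_API_KEY"
    SiteSetting.ai_embeddings_discourse_service_api_endpoint = "https://api.openai.com/v1/embeddings"
    SiteSetting.ai_embeddings_model = "text-embedding-3-large"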

1 Like

3 posts were split to a new topic: How to get both Suggested and Related topics to display

What were your results from comparing small and large? I know there is a difference in dimensions that affects the model’s precision. The small version is 5x cheaper. Is it really unusable in the real world for topic similarity? Our forum is 99% English.

I’d be very interested in hearing more. Can you please elaborate on where all-mpnet-base-v2 sits in comparison to OpenAI models for a purely English site?

Embeddings are so cheap that price doesn’t matter, unless there are so many posts that even 0.01 cents starts to matter in the total cost.

But honestly… I didn’t see any differences. And for me, since there’s a chance I’m not using RAG and embeddings properly, both are equally useless. I know that goes strongly against public opinion, but on my site that system just doesn’t find or use anything useful.

Probably it comes from the OpenAI models, but I don’t have enough money to use the more professional solutions.

1 Like