Discourse AI

:discourse2: Summary: Integration between AI features and Discourse
:globe_with_meridians: Website: Discourse AI Features | Discourse - Civilized Discussion
:hammer_and_wrench: Repository: GitHub - discourse/discourse-ai
:open_book: Install Guide: How to install plugins in Discourse


Discourse AI is our one-stop solution for integrating artificial intelligence with Discourse, both enabling new features and enhancing existing ones.

Discourse AI Features

For Discourse AI, we opted to keep all features in a single plugin; each feature can be enabled independently and customized for your community's needs.

We’ve also made it a priority not to lock you into a single company’s API, so every community can pick the provider that makes sense for them, balancing data privacy, performance, feature sets, and vendor lock-in.

AI Bot

This smart chatbot can answer questions about your Discourse community and more. The AI Bot can search your current Discourse instance and beyond, with persona support to answer any type of question you might have.


Summarization

Summarize topics and chat channels for times when you need a quick way to catch up or figure out what is going on.

Sentiment Analysis

Sentiment Analysis helps you keep tabs on your community by analyzing posts and providing sentiment and emotional scores, giving you an overall sense of your community for any period of time. These insights can help you understand the kinds of users posting and interacting in your community.
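As an illustrative sketch of the kind of period-based aggregation this implies (the score format and field names here are hypothetical; the plugin's actual classifier output differs):

```python
from datetime import date

# Hypothetical per-post sentiment scores in [-1, 1]; Discourse AI's actual
# classifier output format may differ.
posts = [
    {"created": date(2024, 6, 1), "sentiment": 0.8},
    {"created": date(2024, 6, 2), "sentiment": -0.3},
    {"created": date(2024, 6, 15), "sentiment": 0.5},
]

def average_sentiment(posts, start, end):
    # Mean sentiment of posts created within [start, end]; None if no posts.
    scores = [p["sentiment"] for p in posts if start <= p["created"] <= end]
    return sum(scores) / len(scores) if scores else None

print(average_sentiment(posts, date(2024, 6, 1), date(2024, 6, 7)))  # 0.25
```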

AI Helper

AI Helper assists you on your community journey, whether you are creating topics and posts or reading along. It can explain text, proofread, translate, generate content, and much more, helping you save time and put your thinking to better use.


Toxicity

Toxicity can scan both new posts and chat messages and classify them with a toxicity score across a variety of labels, helping you keep toxicity out of your community!


NSFW

NSFW post flags help keep your community safe by tagging NSFW image content in posts and chat messages. This helps you identify and manage explicit content that would be inappropriate for your community.


Embeddings

This is a module that powers two Discourse AI features:

  • Related Topics

    Related Topics helps you find the most relevant topics to read next after you finish reading a topic. Recommendations are based on semantic textual similarity between the topic you are reading and all other topics in your Discourse instance, leading to the discovery of more relevant topics and continued engagement in communities.

  • AI Search

    AI Search uses semantic textual similarity to find relevant topics beyond the exact keyword matches of traditional search. This surfaces topics that are not exact matches but are still relevant to the initial search, helping you find what you need.
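Both features rank topics by the similarity of their embedding vectors. As an illustrative sketch (not the plugin's actual implementation), cosine similarity over toy embeddings can rank candidate topics against the one being read:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def related_topics(current, candidates, top_n=2):
    # Rank candidate topics by similarity to the current topic's embedding.
    ranked = sorted(candidates.items(),
                    key=lambda kv: cosine_similarity(current, kv[1]),
                    reverse=True)
    return [title for title, _ in ranked[:top_n]]

# Toy 3-dimensional "embeddings" (real models use hundreds of dimensions).
topics = {
    "Installing plugins": [0.9, 0.1, 0.0],
    "Backup and restore": [0.1, 0.9, 0.1],
    "Plugin development": [0.8, 0.2, 0.1],
}
print(related_topics([0.85, 0.15, 0.05], topics, top_n=2))
# → ['Installing plugins', 'Plugin development']
```

AI Search works the same way, except the query text is embedded instead of the current topic.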

Configuration + Details

:information_source: We are always adding new functionality and changing our Discourse AI features! Please check each AI feature for full details, including configuration, provider support, feature set, and more.


:warning: We are being very mindful with our experimentation around AI. The algorithms we are leaning on are only as good as the data they were trained on. Bias, inaccuracies and hallucinations are all possibilities we need to allow for. We regularly revisit, test and refine our AI features.

Self Hosting

Check the docs on self-hosting the API services at Discourse AI - Self-Hosted Guide


Will this be available on Discourse hosting? Which plans?

Discourse AI (basic) → Available on all plans hosted by Discourse
Discourse AI (advanced) → Available only for Enterprise customers

Please check each AI feature for the latest availability. Select features may roll out to other tiers later.

:partying_face: Updates

Will CDCK offer a SaaS version of the AI services API for self-hosted communities?

Not at the moment, but this is something we may consider given the feedback from our community.

Last edited by @JammyDodger 2024-06-18T14:19:54Z


2 posts were split to a new topic: Managing consumable AI costs

I would like to suggest two very useful features: counting the tokens used per user, and letting users create their own personas for the AI Bot.


Hi @Oniel :slight_smile:

To make feature requests easier to track, and to see how popular they are, it’s best to start a fresh feature topic for each suggestion. :+1: If you could also include as much detail as possible about why each idea should be developed, that helps strengthen the case for its adoption.


I just see a chatbot in my plugins. Business plan.


We are aiming to build a version of this in the future to calculate LLM costs and token usage.

I believe you are talking about the New AI Persona Editor for Discourse.


Two questions:

  • Are there any plans to support Anthropic Claude 3 Sonnet/Opus from AWS Bedrock?
  • Are there any limitations to which AI features of the plugin are supported by AWS Bedrock?

I wish we could support Opus, but sadly AWS Bedrock needs to support it first.

Haiku and Sonnet via Bedrock are now supported!

Hmm, the only one I can think of is that I’m not sure whether there are any embedding models on Bedrock that we support, @Falco? But people who host with us already get our open-source model, and Cloudflare/Google have free embedding solutions, so it’s hard to justify. That said, the Cohere models are there, so we probably want to add that.


Yes sorry, I’m told it will likely release within the month though.

I ask because, at least for now and the foreseeable future, we are unable to use anything but our own, owned instance of AWS Bedrock for our AI.

So if I have ai sentiment enabled checked, do I need to set the ai sentiment inference service api endpoint to something for Bedrock? Or, if I leave the default value of https://sentiment-testing.demo-by-discourse.com but enable Bedrock below, will sentiment then be done through Bedrock?


We can use Bedrock for any LLM needs in the plugin, like AI Bot, enhancing search with HyDE, AI Triage, Topic summary, Chat title generation, weekly recaps, etc.

Features that depend on non-LLM models, like embeddings and sentiment, do not work with Bedrock yet, but it’s something we plan to make compatible in the long run.


Where is the current sentiment URL https://sentiment-testing.demo-by-discourse.com hosted? Are there any limitations to it (on Enterprise), since I see it is a testing/demo URL?

Is there a production URL that we should be on, if not the testing/demo URL?


While that URL is a default pointing to a server we host so the plugin works out of the box, sites hosted by us have the plugin pointed to a server in the same data center where your forum is hosted, the same way we host your database and cache server.


Is the same true for Embeddings?


Yes, exact same.


13 posts were split to a new topic: GPT 3.5 is not configured despite having an API key

Well, that was fast. :smiley:


Thanks. We’re eagerly awaiting that as well.

If we populate an OpenAI key, is there a way to know which user is using the most tokens (to get a better gauge on cost)? And can we limit usage based on cost per user?

Will the OpenAI key work for all bots and personas? Or do we need different LLMs for different features? I’m not 100% sure; it’s a little confusing.

Yes, that is stored in the ai_api_audit_logs table.

Yes, all bots and personas.
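As a sketch of how per-user token totals could be tallied from rows like those in ai_api_audit_logs (the column names here are hypothetical; the real table's schema may differ):

```python
from collections import defaultdict

# Hypothetical rows in the shape of ai_api_audit_logs entries; the real
# table's column names may differ.
audit_rows = [
    {"user": "alice", "request_tokens": 120, "response_tokens": 340},
    {"user": "bob",   "request_tokens": 80,  "response_tokens": 150},
    {"user": "alice", "request_tokens": 200, "response_tokens": 410},
]

def tokens_per_user(rows):
    # Sum request + response tokens per user to gauge relative cost.
    totals = defaultdict(int)
    for row in rows:
        totals[row["user"]] += row["request_tokens"] + row["response_tokens"]
    return dict(totals)

print(tokens_per_user(audit_rows))  # → {'alice': 1070, 'bob': 230}
```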


A post was split to a new topic: Claude 3 Opus tool calling is very verbose