Discourse AI

:discourse2: Summary: Integration between AI features and Discourse
:globe_with_meridians: Website: Discourse AI Features | Discourse - Civilized Discussion
:hammer_and_wrench: Repository: GitHub - discourse/discourse-ai
:open_book: Install Guide: How to install plugins in Discourse

Discourse AI

Discourse AI is our one-stop solution for integrating Artificial Intelligence and Discourse, enabling both new features and enhancing existing ones.

Discourse AI features

For Discourse AI, we have opted to keep most features in a single plugin. Each feature can be enabled independently and customized for your community's needs.

We’ve also made it a priority not to lock you into a single company's API, so every community can pick the provider that makes sense for them, balancing data privacy, performance, feature set, and vendor lock-in.

AI bot

The AI Bot is a smart chatbot that can answer questions about your Discourse community and more. With persona support, it can search your Discourse instance and beyond to answer almost any question you might have.

AI search

AI Search helps you find the most relevant topics using semantic textual similarity, going beyond the exact keyword matching used by traditional search. Topics that don't match your query word for word but are still relevant get surfaced, helping you find what you need.
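
As a rough illustration of how semantic similarity search works in general (this is not the plugin's actual implementation, which computes and stores embeddings inside your Discourse database), here is a minimal Python sketch; the embedding model and example topics are arbitrary choices for the demo:

```python
# Minimal sketch of semantic search: embed texts, then rank by cosine similarity.
# Not the plugin's implementation; the model below is just one open example.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

topics = [
    "How to install plugins in Discourse",
    "Troubleshooting email delivery problems",
    "Customising the theme of my forum",
]

# Embed every topic title once, then embed the query at search time.
topic_vectors = model.encode(topics, normalize_embeddings=True)
query_vector = model.encode(["my forum is not sending mail"], normalize_embeddings=True)[0]

# With normalised vectors, cosine similarity is a plain dot product.
scores = topic_vectors @ query_vector
for topic, score in sorted(zip(topics, scores), key=lambda pair: -pair[1]):
    print(f"{score:.3f}  {topic}")
```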

Automation + AI

With the Automation plugin, you can automatically classify your posts and topics via AI triage. Set automation rules and AI triage will analyze posts, performing actions such as hiding, tagging, or flagging NSFW, spam, or toxic content, and much more. You can also generate periodic reports such as forum summaries to stay on top of community activity!
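
To make the triage idea concrete, here is a conceptual Python sketch of classifying a post with an LLM and mapping the result to an action. This is not the Automation plugin's API (real rules are configured in the admin UI); the prompt, labels, model name, and action mapping are illustrative assumptions only:

```python
# Conceptual sketch of AI triage: classify a post, then decide an action.
# NOT the Automation plugin's API; labels, prompt, and thresholds are illustrative.
from openai import OpenAI  # assumes the official OpenAI Python SDK

client = OpenAI()  # reads OPENAI_API_KEY from the environment

LABELS = ["ok", "spam", "nsfw", "toxic"]

def classify_post(post_text: str) -> str:
    """Ask the model to pick exactly one label for the post."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": f"Classify the forum post as one of: {', '.join(LABELS)}. "
                        "Reply with the label only."},
            {"role": "user", "content": post_text},
        ],
    )
    label = response.choices[0].message.content.strip().lower()
    return label if label in LABELS else "ok"

def triage(post_text: str) -> str:
    """Map the label to the kind of action a triage rule might take."""
    label = classify_post(post_text)
    return {"spam": "hide and flag", "nsfw": "hide", "toxic": "flag"}.get(label, "leave as is")

print(triage("Buy cheap watches at example.com!!!"))
```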

Helper

Helper assists you in your community journey, whether you are creating topics and posts or reading along. It can explain text, proofread, translate, generate content and captions, and much more. It's designed to enhance user productivity and improve the overall quality of contributions.

Related topics

Related Topics helps you find the most relevant topics to read next once you finish reading a topic. Recommendations are based on the semantic textual similarity between the current topic and all other topics in your Discourse instance, leading to the discovery of more relevant topics and continued engagement in your community.

Sentiment

Sentiment helps you keep tabs on your community by analyzing posts and producing sentiment and emotion scores, giving you an overall sense of your community over any period of time. These insights can help you understand the kinds of users posting in your community and how they interact with one another.
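
As a toy illustration of the kind of roll-up these scores enable (the actual classification happens server-side with a dedicated model; the posts and numbers below are made up), consider:

```python
# Illustrative only: aggregate per-post sentiment scores over a period.
# The real feature classifies posts server-side; these hard-coded scores
# just show the kind of summary an admin would look at.
from datetime import date
from statistics import mean

posts = [
    {"created_at": date(2024, 10, 1),  "sentiment": {"positive": 0.81, "negative": 0.05}},
    {"created_at": date(2024, 10, 3),  "sentiment": {"positive": 0.12, "negative": 0.78}},
    {"created_at": date(2024, 10, 20), "sentiment": {"positive": 0.55, "negative": 0.20}},
]

start, end = date(2024, 10, 1), date(2024, 10, 31)
window = [p["sentiment"] for p in posts if start <= p["created_at"] <= end]

print("posts in period:", len(window))
print("avg positive:", round(mean(s["positive"] for s in window), 2))
print("avg negative:", round(mean(s["negative"] for s in window), 2))
```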

Summarize

Summarize topics and chat channels, for times when you need a quick way to catch up or figure out what is going on.

Configuration & details

:information_source: We are always adding new functionality and changing our Discourse AI features! Please check each AI feature for all details including configuration, provider support, feature set, and more.


Disclaimer

:warning: We are being very mindful with our experimentation around AI. The algorithms we are leaning on are only as good as the data they were trained on. Bias, inaccuracies and hallucinations are all possibilities we need to allow for. We regularly revisit, test and refine our AI features.

Self Hosting

Check the docs on self-hosting the API services at Discourse AI - Self-Hosted Guide.

FAQ

Will this be available on Discourse hosting? Which plans?

The Discourse AI plugin and all features are now available for customers hosted on Standard, Business, and Enterprise plans. :partying_face:

Will CDCK offer a SaaS version of the AI services API for self-hosted communities?

Not at the moment, but this is something we may consider given the feedback from our community.



2 posts were split to a new topic: Managing consumable AI costs

I would like to suggest two very useful features: counting the tokens used per user, and letting users create their own personas for the AI Bot.


Hi @Oniel :slight_smile:

To make feature requests easier to track and to see how popular they are, it’s best to start a fresh feature topic for each suggestion. :+1: If you could also include as much detail as possible about why you think each idea should be developed, that would help strengthen the case for its adoption.


I just see a chatbot in my plugins. Business plan.


We are aiming to build a version of this in the future to calculate LLM costs/token usage.

I believe you are talking about the New AI Persona Editor for Discourse.


Two questions:

  • Are there any plans to support Anthropic Claude 3 Sonnet/Opus from AWS Bedrock?
  • Are there any limitations to which AI features of the plugin are supported by AWS Bedrock?

I wish we could support Opus, but sadly AWS Bedrock needs to support it first.

Haiku and Sonnet via Bedrock are now supported!

Hmm, the only one I can think of is that I am not sure whether there are any embedding models on Bedrock that we support, @Falco? But people who host with us already get our open source model, and Cloudflare/Google have free embedding solutions, so it is hard to justify. That said, the Cohere models are there, so we probably want to add that.


Yes sorry, I’m told it will likely release within the month though.

I ask because, at least for now and the foreseeable future, we are unable to use anything but our own, owned instance of AWS Bedrock for our AI.

So if I have ai sentiment enabled checked, do I need to set the ai sentiment inference service api endpoint to something for Bedrock? Or, if I leave the default value of https://sentiment-testing.demo-by-discourse.com but enable Bedrock below, will sentiment then be done through Bedrock?


We can use Bedrock for any LLM needs in the plugin, like AI Bot, enhancing search with HyDE, AI Triage, Topic summary, Chat title generation, weekly recaps, etc.

Features that depend on non-LLM models, like embeddings and sentiment, do not work with Bedrock yet, but it's something we plan on making compatible in the long run.


Where is the current sentiment URL https://sentiment-testing.demo-by-discourse.com hosted? Are there any limitations on that (on Enterprise), since I see it is a testing/demo URL?

Is there a production URL that we should be on, if not the testing/demo URL?


That URL is a default pointing at a server we host so people can get the plugin working out of the box. Sites hosted by us have the plugin pointed at a server in the same data center where your forum is hosted, the same way we host your database and cache server.


Is the same true for Embeddings?


Yes, exact same.


13 posts were split to a new topic: GPT 3.5 is not configured despite having an API key

Well, that was fast. :smiley:


Thanks. We’re eagerly awaiting that as well.

If we populate an OpenAI key, is there a way to know which user is using the most tokens (to get a better gauge on cost), and/or can we limit usage based on cost per user?

Will the OpenAI key work for all bots and personas? Or do we need different LLMs for different features? I'm not 100% sure on that; it's a little confusing.

Yes, that is stored in the ai_api_audit_logs table.

Yes, all bots and personas.
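
For anyone curious what "which user is using the most tokens" could look like in practice, here is a rough sketch of a query against that table, run for example via the Data Explorer plugin or a database console. The column names request_tokens and response_tokens are assumptions on my part; verify them against the schema on your instance:

```python
# Rough sketch: rank users by AI token usage from the ai_api_audit_logs table.
# Column names (user_id, request_tokens, response_tokens, created_at) are assumed
# from the reply above -- check your instance's schema before relying on them.
import psycopg2

SQL = """
SELECT user_id,
       SUM(COALESCE(request_tokens, 0) + COALESCE(response_tokens, 0)) AS total_tokens
FROM ai_api_audit_logs
WHERE created_at > NOW() - INTERVAL '30 days'
GROUP BY user_id
ORDER BY total_tokens DESC
LIMIT 20;
"""

with psycopg2.connect(dbname="discourse") as conn:  # adjust connection details
    with conn.cursor() as cur:
        cur.execute(SQL)
        for user_id, total_tokens in cur.fetchall():
            print(user_id, total_tokens)
```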


A post was split to a new topic: Claude 3 Opus tool calling is very verbose