Unlock All Discourse AI Features with Our Hosted LLM

We’re thrilled to announce that our hosted customers can now power every Discourse AI feature using our own hosted, open-weights LLM[1], pre-configured and included free of charge with your hosting service.

This means every AI feature is available to Starter, Pro, Business, and Enterprise customers without needing any third-party LLM provider.

:gear: Enabling the hosted LLM

This should already be available and enabled on your site on the LLM configuration page:

Admin → Plugins → Discourse AI → LLMs

By default, “CDCK Hosted Small LLM” will be selected as your default LLM, and it should also be available for selection by any persona on your site.

:chart_increasing: Usage and limits

Discourse measures hosted LLM usage through a credit system. Each request or response token consumes 1 credit. Your credit allotment is proportional to your hosting tier and resets daily.

✱ Some features, such as AI spam detection, do not count toward your credits
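The credit accounting above is simple enough to sketch. This is only an illustration of the arithmetic (1 credit per prompt or response token, capped by a daily allotment); the token counts and daily limit below are made up, not actual tier numbers.

```python
def credits_used(prompt_tokens: int, response_tokens: int) -> int:
    """1 credit per token, whether sent to or received from the LLM."""
    return prompt_tokens + response_tokens

def remaining(daily_limit: int, used_today: int) -> int:
    """Credits left before AI features pause until the daily reset."""
    return max(daily_limit - used_today, 0)

# Example: a call with a 1,200-token prompt and a 300-token
# response consumes 1,500 credits.
used = credits_used(1_200, 300)
print(used)                      # 1500
print(remaining(50_000, used))   # 48500
```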

Credit limits per tier can be found on the pricing page on our website. After you reach your daily limit, any AI features that rely on the LLM will pause until credits reset the next day.

If you’re running out of credits quickly, consider using the LLM quota system to set per-user or per-group restrictions. Upgrading to a higher tier is another option if you need additional capacity.
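To make the per-user/per-group idea concrete, here is a hypothetical sketch of that kind of quota logic. The group names and limits are invented for illustration; in Discourse the actual quotas are configured through the admin UI, not code like this.

```python
# Illustrative daily quotas per group (made-up values).
GROUP_DAILY_QUOTAS = {
    "staff": 10_000,
    "trust_level_2": 2_000,
    "everyone": 500,   # fallback for all other users
}

def user_quota(groups: list[str]) -> int:
    """A user gets the largest quota among their groups."""
    quotas = [GROUP_DAILY_QUOTAS[g] for g in groups if g in GROUP_DAILY_QUOTAS]
    return max(quotas, default=GROUP_DAILY_QUOTAS["everyone"])

def may_use_ai(groups: list[str], used_today: int) -> bool:
    """True while the user is under their daily quota."""
    return used_today < user_quota(groups)

print(user_quota(["trust_level_2", "staff"]))  # 10000
print(may_use_ai(["everyone"], 499))           # True
print(may_use_ai(["everyone"], 500))           # False
```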

You can review your site’s AI usage at any time from the AI usage page in the admin panel.

For more details on what to do when credits are depleted, see:


  1. Large Language Model ↩︎


Is the CDCK Hosted Small LLM available via an API for self-hosted Discourse instances to use? If so, I would presume there would be an associated fee.


No, it is not, and we don’t plan on making it so.

For self-hosted instances, options such as the Gemini free tier or OpenRouter are great alternatives.
