We’re thrilled to announce that our hosted customers can now power every Discourse AI feature using our own hosted, open-weights LLM[1], pre-configured and included free of charge with your hosting service.
This means every AI feature is available to Starter, Pro, Business, and Enterprise customers without needing any third-party LLM provider.
Enabling the hosted LLM
The hosted LLM should already be available and enabled on your site's LLM configuration page:
Admin → Plugins → Discourse AI → LLMs
“CDCK Hosted Small LLM” will be selected as your default LLM, and it will also be available for selection by any persona on your site.
Usage and limits
Discourse measures hosted LLM usage through a credit system: each token in a request or response consumes 1 credit. For example, a 150-token prompt that produces a 350-token reply consumes 500 credits. Your credit allowance is proportional to your hosting tier and resets daily.
✱ Some features, such as AI spam detection, are not counted towards your credits.
Credit limits per tier can be found on the pricing page on our website. After you reach your daily limit, any AI features that rely on the LLM will pause until credits reset the next day.
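To get a feel for how quickly credits are consumed, here is a quick back-of-the-envelope sketch in Python. All the numbers are hypothetical placeholders, not actual tier limits; check the pricing page for your real allowance.

```python
# Back-of-the-envelope credit math: 1 token in or out = 1 credit.
# All numbers below are hypothetical examples, not actual tier limits.

DAILY_CREDITS = 500_000       # hypothetical daily allowance for a tier
AVG_PROMPT_TOKENS = 200       # assumed average tokens per request
AVG_RESPONSE_TOKENS = 400     # assumed average tokens per response

credits_per_interaction = AVG_PROMPT_TOKENS + AVG_RESPONSE_TOKENS
interactions_per_day = DAILY_CREDITS // credits_per_interaction

print(f"Each interaction costs ~{credits_per_interaction} credits.")
print(f"The daily allowance covers ~{interactions_per_day} interactions.")
```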
If you’re running out of credits quickly, consider using the LLM quota system to set per-user or per-group restrictions. Upgrading to a higher tier is another option if you need additional capacity.
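To illustrate the idea behind per-group quotas, here is a minimal sketch of the bookkeeping involved. This is purely illustrative: quotas are configured in the admin UI, not in code, and the group names and caps below are made up.

```python
from collections import defaultdict

# Hypothetical per-group daily token caps; real quotas are set in the
# Discourse admin UI, not in code.
GROUP_QUOTAS = {"staff": 100_000, "trust_level_2": 25_000, "everyone": 5_000}

used = defaultdict(int)  # tokens consumed per group today

def try_spend(group: str, tokens: int) -> bool:
    """Record a request if the group still has quota left today."""
    cap = GROUP_QUOTAS.get(group, 0)
    if used[group] + tokens > cap:
        return False  # quota exhausted; the request would be rejected
    used[group] += tokens
    return True

print(try_spend("everyone", 4_000))  # True: within the 5,000-token cap
print(try_spend("everyone", 2_000))  # False: would exceed the cap
```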
You can review your site’s AI usage at any time from the AI usage page in the admin panel.
For more details on what to do when credits are depleted, see:
[1] Large Language Model ↩︎
