If you’re using our CDCK Hosted LLM and you’ve run out of AI credits, don’t worry: you still have a few options to keep your AI features running smoothly.
Why this happens
Each Discourse site comes pre-configured with our CDCK Hosted LLM, which is allotted a daily credit balance based on your hosting plan. Once those credits are consumed, any AI features that rely on the hosted LLM will pause until credits reset the following day.
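Conceptually, the daily credit pool behaves like this (an illustrative sketch with hypothetical numbers; actual allotments depend on your hosting plan and are not configurable in code):

```python
from dataclasses import dataclass

@dataclass
class CreditPool:
    """Hypothetical model of a daily AI credit balance."""
    daily_allotment: int
    used: int = 0

    def spend(self, credits: int) -> bool:
        """Consume credits; return False once the pool is exhausted,
        at which point AI features pause until the next daily reset."""
        if self.used + credits > self.daily_allotment:
            return False
        self.used += credits
        return True

    def daily_reset(self) -> None:
        """Credits reset the following day."""
        self.used = 0

pool = CreditPool(daily_allotment=1000)
assert pool.spend(900)       # normal usage succeeds
assert not pool.spend(200)   # exceeds today's balance: features pause
pool.daily_reset()           # next day, the balance is restored
assert pool.spend(200)
```

The key point is that the limit is time-based rather than cumulative: unused credits don't carry over, and exhausted credits come back on their own the next day.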
Options
Upgrade your plan for more credits
Upgrading increases your credit pool immediately. This option works best if you rely on the hosted LLM and want predictable usage included in your plan.
If you want to continue using the CDCK Hosted LLM with a higher limit, go to Admin → Dashboard → Upgrade Plan and upgrade to a higher tier with a larger AI credit allotment.
Connect to a third-party LLM provider
Using your own LLM means CDCK credits are no longer consumed for those features.
If you want to avoid hosted credit limits entirely, you can bring your own API key.
- Go to Admin → Plugins → Discourse AI → LLMs
- Click Set up on a pre-configured LLM template, or use the manual configuration option
- Fill out the provider details and API key, then test your preferred model
- Assign this model as your Default LLM, or assign it to the specific features you want it to power
For more information about configuring third-party LLMs in Discourse AI, see the Discourse AI - Large Language Model (LLM) settings page.
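Before entering your API key in the admin panel, it can help to confirm the key works with a direct request to your provider. A minimal check for an OpenAI-compatible provider (the endpoint and model name here are examples; substitute your own provider's values):

```shell
# Assumes your key is exported as OPENAI_API_KEY; adjust the URL and
# model for your provider. A successful response means the key is valid.
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

If this request fails with an authentication error, fix the key with your provider first; the same key will fail inside Discourse too.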
Reduce usage by setting per-user quotas
Admins can limit how many tokens each user can consume with AI features. This option is ideal if you want to stay on your current tier but control usage more tightly.
Where to configure it:
- Admin → Plugins → Discourse AI → LLMs
- Select Edit on CDCK Hosted LLM
- Press the Add quotas button to add token limits per user or per group
What this does:
- Puts a hard limit on how much each user can use AI tools per month
- Helps prevent heavy users from consuming the entire credit pool
- Lets you stretch your existing credits without upgrading your plan
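The behavior described above can be sketched as follows (an illustrative model with hypothetical limits, not Discourse's actual implementation; real quotas are configured entirely in the admin UI):

```python
from collections import defaultdict

class QuotaTracker:
    """Hypothetical per-user monthly token quota."""

    def __init__(self, monthly_limit: int):
        self.monthly_limit = monthly_limit
        self.usage = defaultdict(int)  # tokens used this month, per user

    def allow(self, user: str, tokens: int) -> bool:
        """Return True if the request fits within the user's monthly quota.
        A hard limit: once exceeded, that user's AI access pauses."""
        if self.usage[user] + tokens > self.monthly_limit:
            return False
        self.usage[user] += tokens
        return True

quotas = QuotaTracker(monthly_limit=50_000)
assert quotas.allow("alice", 40_000)
assert not quotas.allow("alice", 20_000)  # alice would exceed her quota
assert quotas.allow("bob", 20_000)        # bob's allowance is unaffected
```

Because each user draws from their own allowance rather than the shared pool, one heavy user hitting their cap no longer blocks AI features for everyone else.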