Unlock All Discourse AI Features with Our Hosted LLM

Hey @westes,

I hear ya. I can see why the credit limits might feel restrictive at first.

A few things that might help put this in perspective:

Credits replenish daily, so you get a fresh 15k (Starter) or 30k (Pro) every 24 hours. In practice, it’s pretty unlikely you’d see 20 different users all requesting summaries on the same day, especially since we have caching in place: once a topic is summarized, subsequent users see the cached version without consuming additional credits. But if you genuinely are seeing that level of daily summary usage, that’s actually a great sign! It means your forum is active and people are engaged.

If usage really is that high, you might be ready for a higher tier. The Business tier (100k credits) would comfortably support a very active community. High AI feature usage usually signals you’re outgrowing your current plan in other ways too.

Before you consider a tier upgrade, per-user quotas can also help. The LLM quota system lets you distribute usage more evenly across your users, so a few power users can’t exhaust the daily allowance early in the day.

If that still doesn’t cover your needs, a third-party LLM might be a better fit for your use case. You can connect your own LLM provider (OpenAI, Anthropic, Gemini, etc.). You’d be paying for it separately, but it gives you more control and may be more economical for high-volume usage.

Our goal with the hosted LLM is an out-of-the-box option that works well for most customers without the hassle of API keys or separate billing, but you’re not locked into it. It’s meant to be helpful, not limiting, and if it does feel too restrictive, switching to a third-party provider is totally valid and something we fully support.