This guide covers the AI usage page which is part of the Discourse AI plugin.
Required user level: Administrator
The AI usage page is designed to help admins understand how the community is using Discourse AI features over time. This can be helpful for estimating costs; see Estimating costs of using LLMs for Discourse AI.
Features
- Time horizon: last 24 hours, last week, last month and custom date range
- Selectable by Discourse AI feature
- Selectable by enabled LLMs
- Summary preview
- Total requests: All requests made to LLMs through Discourse
- Total tokens: All tokens consumed across requests and responses
- Request tokens: Tokens in the input sent to the LLM (your prompt plus any context)
- Response tokens: Tokens in the LLM's generated reply to your prompt
- Cache read tokens: Previously processed request tokens that the LLM reuses from cache to optimize performance and cost
- Cache write tokens: Request tokens that were written to cache for potential future reuse
- Estimated cost: Cumulative cost of all tokens used by the LLMs based on specified cost metrics added to LLM configurations
- Interactable bar graph showcasing token usage
- Token, usage count and estimated cost per Discourse AI feature
- Token, usage count and estimated cost per LLM
- Token, usage count and estimated cost per user
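To illustrate how an estimated cost can be derived from these metrics, here is a minimal sketch. The function name, parameters, and rates are illustrative assumptions, not the plugin's actual code; LLM providers typically price input and output tokens separately, per million tokens.

```python
# Hypothetical sketch: combine token counts with per-million-token rates
# (rates come from the cost metrics configured per LLM).

def estimated_cost(request_tokens, response_tokens,
                   input_rate_per_million, output_rate_per_million):
    """Return the estimated cost in the same currency unit as the rates."""
    return (request_tokens * input_rate_per_million
            + response_tokens * output_rate_per_million) / 1_000_000

# e.g. 120k request tokens and 30k response tokens
# at $3 / $15 per 1M tokens (illustrative rates)
cost = estimated_cost(120_000, 30_000, 3.0, 15.0)
print(f"${cost:.2f}")  # → $0.81
```

Because cached request tokens are usually billed at a lower rate, a fuller estimate would also account for cache read and cache write tokens.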
Enabling AI usage
Prerequisites
For data to be populated, you must configure at least one Large Language Model (LLM) from a provider, and members must be using the Discourse AI features.
To get started you can configure LLMs through the Discourse AI - Large Language Model (LLM) settings page.
- OpenAI
- Anthropic
- Azure OpenAI
- AWS Bedrock
- Google Gemini
- Mistral
- Groq
- SambaNova
- Open Router
- Cohere
- HuggingFace
- vLLM / Self-Hosting an OpenSource LLM
Configuration
- Go to Admin settings → Plugins → AI → Settings tab and make sure the plugin is enabled (discourse ai enabled)
- Within the AI plugin, navigate to the Usage tab
Technical FAQ
What are tokens and how do they work?
- Tokens are the basic units that LLMs use to understand and generate text; token usage directly affects costs. Check out Estimating costs of using LLMs for Discourse AI to learn more.
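As a rough illustration of how text maps to tokens: real tokenizers (such as OpenAI's tiktoken) split text into subword units, but a common rule of thumb for English text is about 4 characters per token. The function below is a heuristic sketch, not the tokenizer any provider actually uses.

```python
# Rough heuristic only: ~4 characters per token for English text.
# Actual tokenizers are model-specific subword encoders.

def approx_token_count(text: str) -> int:
    """Estimate the token count of a prompt with the ~4 chars/token heuristic."""
    return max(1, round(len(text) / 4))

prompt = "Summarize the latest topic in this category."
print(approx_token_count(prompt))  # → 11
```

This kind of estimate is only useful for ballpark planning; the usage page reports the exact token counts returned by each provider.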
