Discourse AI - AI usage

:bookmark: This guide covers the AI usage page which is part of the Discourse AI plugin.

:person_raising_hand: Required user level: Administrator

The AI usage page is designed to help admins understand how the community is using Discourse AI features over time. This can be helpful when estimating the costs of using LLMs for Discourse AI.
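Since providers typically bill per token, the request and response token counts shown in the summary can be combined with a provider's price sheet to produce a rough estimate. A minimal sketch in Python; the per-million-token prices below are placeholders, not any real provider's rates:

```python
# Rough LLM cost estimate from the token counts shown on the AI usage page.
# The prices below are placeholders; substitute your provider's actual rates.

PRICE_PER_MILLION_REQUEST_TOKENS = 3.00    # USD per 1M input/prompt tokens (placeholder)
PRICE_PER_MILLION_RESPONSE_TOKENS = 15.00  # USD per 1M output/response tokens (placeholder)

def estimate_cost(request_tokens: int, response_tokens: int) -> float:
    """Return an approximate spend in USD for the given token counts."""
    request_cost = request_tokens / 1_000_000 * PRICE_PER_MILLION_REQUEST_TOKENS
    response_cost = response_tokens / 1_000_000 * PRICE_PER_MILLION_RESPONSE_TOKENS
    return request_cost + response_cost

# Example: numbers copied from the usage page summary for the selected period.
print(f"${estimate_cost(request_tokens=2_500_000, response_tokens=400_000):.2f}")
```

Cached tokens are usually billed at a reduced rate, so treating every request token at the full input price gives a conservative upper bound.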

Features

  • Time horizon: last 24 hours, last week, and custom date ranges
  • Filter by Discourse AI feature
  • Filter by enabled LLM
  • Summary preview
    • Total requests: All requests made to LLMs through Discourse
    • Total tokens: All tokens consumed, covering both prompts and responses
    • Request tokens: Tokens used for the input sent to the LLM (your prompt and its context)
    • Response tokens: Tokens used by the LLM when responding to your prompt
    • Cached tokens: Previously processed request tokens that the LLM uses to optimize performance and cost
  • Interactive bar graph showing token usage
  • Token and usage count per Discourse AI feature
  • Token and usage count per LLM

Enabling AI usage

Prerequisites

:information_source: For data to be populated, you must configure at least one Large Language Model (LLM) from a provider and have members using the Discourse AI features.

To get started, you can configure LLMs through the Discourse AI - Large Language Model (LLM) settings page.

Configuration

  1. Go to the Admin settings → Plugins → AI → Settings tab and make sure the plugin is enabled (discourse ai enabled)
  2. Within the AI plugin, navigate to the Usage tab
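If you prefer to verify the plugin setting from outside the UI, the standard Discourse admin API can be used with an admin API key. A minimal sketch, assuming the /admin/site_settings.json endpoint and its usual response shape; adjust the URL and key for your site:

```python
# Check that the Discourse AI plugin is enabled via the admin API.
# Assumes an admin API key; the endpoint path and response fields may differ
# on your Discourse version, so treat this as a sketch rather than a reference.
import requests

BASE_URL = "https://forum.example.com"   # your Discourse URL (placeholder)
HEADERS = {
    "Api-Key": "YOUR_ADMIN_API_KEY",     # generated under Admin > API > Keys
    "Api-Username": "system",
}

resp = requests.get(f"{BASE_URL}/admin/site_settings.json", headers=HEADERS)
resp.raise_for_status()

for setting in resp.json().get("site_settings", []):
    if setting.get("setting") == "discourse_ai_enabled":
        print("discourse_ai_enabled =", setting.get("value"))
```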

Technical FAQ

What are tokens and how do they work?

Tokens are the small chunks of text (whole words, parts of words, or punctuation) that an LLM reads and writes. Every prompt sent to a model is split into tokens before it is processed, and the model generates its reply token by token; providers typically meter and bill usage by the number of tokens in both directions, which is what the counts on this page reflect.
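As a quick illustration, a tokenizer library such as OpenAI's tiktoken can show how a sentence splits into tokens; other providers use different tokenizers, so exact counts vary by model:

```python
# Count tokens the way many OpenAI models do; other providers tokenize differently.
# Requires: pip install tiktoken
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")  # tokenizer used by many OpenAI models
text = "Discourse AI keeps track of how many tokens each feature uses."
tokens = encoding.encode(text)

print(len(tokens), "tokens")                           # the number a usage report would meter
print(encoding.decode_single_token_bytes(tokens[0]))   # raw text of the first token
```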
