This topic covers the configuration of the Summarize feature of the Discourse AI plugin.
Required user level: Administrator
Summarize topics and chat channels for a quick recap. Use it in mega topics and large discussions to figure out what's happening.
Features
- Summarize topics from topic map (top and bottom of topic)
- Summarize chat channels for a specific duration of time (up to 7 days)
- Cached summaries for topics that were previously summarized
- Regenerate older summaries
- View summary date and AI model used
Enabling Summarize
Prerequisites
You must configure at least one Large Language Model (LLM) from a provider.
To get started, you can configure one through the Discourse AI - Large Language Model (LLM) settings page. Supported providers include the following (an optional credential sanity check is sketched after this list):
- OpenAI
- Anthropic
- Azure OpenAI
- AWS Bedrock with Anthropic access
- HuggingFace Endpoints with Llama2-like model
- Self-Hosting an OpenSource LLM
- Google Gemini
- Cohere
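Before entering a provider key into the LLM settings page, it can help to confirm the key actually works. The snippet below is a minimal sketch for the OpenAI case only, using OpenAI's public "list models" endpoint; the environment variable name is an assumption, and other providers will need their own equivalent check.

```python
import os
import requests

# Optional sanity check (OpenAI example): confirm the API key is accepted
# before entering it into the Discourse LLM settings page.
api_key = os.environ["OPENAI_API_KEY"]  # assumed env var holding your key

resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
    timeout=30,
)
resp.raise_for_status()  # a 401 here means the key was rejected

# List a few model ids so you know what to pick in the LLM settings page.
for model in resp.json()["data"][:5]:
    print(model["id"])
```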
Configuration
- Go to Admin → Plugins → AI → Settings and make sure the plugin is enabled (`discourse ai enabled`)
- Set the LLM to be used through `ai summarization model`
- Check `ai summarization enabled` to enable Summarize
- We recommend setting which groups of users can generate and view summaries through `ai custom summarization allowed groups`
- (Optional) Enable private message (PM) summaries for specific user groups through `ai pm summarization allowed groups`
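If you prefer to apply these options from a script instead of the admin UI, the sketch below updates site settings through the Discourse admin API. It is a minimal sketch, assuming an admin API key and the standard `PUT /admin/site_settings/{name}` endpoint; the site URL, API key, username, and the group id used for the allowed-groups setting are placeholders.

```python
import requests

# Minimal sketch: toggle Summarize-related site settings via the Discourse
# admin API instead of the admin UI. Assumes an admin API key created under
# Admin -> API and the standard /admin/site_settings/{name} endpoint.
DISCOURSE_URL = "https://forum.example.com"   # placeholder: your site URL
HEADERS = {
    "Api-Key": "your-admin-api-key",          # placeholder: admin API key
    "Api-Username": "system",                 # placeholder: admin username
}

def update_setting(name: str, value: str) -> None:
    """Set a single Discourse site setting by name."""
    resp = requests.put(
        f"{DISCOURSE_URL}/admin/site_settings/{name}",
        headers=HEADERS,
        data={name: value},
        timeout=30,
    )
    resp.raise_for_status()

# Enable the plugin and Summarize, and restrict summaries to one group
# (group id 3 is the staff group on a default install).
update_setting("discourse_ai_enabled", "true")
update_setting("ai_summarization_enabled", "true")
update_setting("ai_custom_summarization_allowed_groups", "3")
```

Restricting `ai custom summarization allowed groups` to a small group, as in the last line, is also the simplest way to keep LLM costs under control (see Caveats below).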
Self-hosters will also be required to configure the following settings:
- `ai_summarization_discourse_service_api_endpoint`
- `ai_summarization_discourse_service_api_key`
Technical FAQ
Does Summarize cache results?
- Summarize does cache results, and it even makes them available to users outside of the selected user groups.
Caveats
- Summarize outputs may not be 100% accurate, so make sure to check any output carefully
- LLM calls can be expensive. We recommend enabling Summarize for specific user groups to help control costs