Discourse AI - Summarize

:bookmark: This topic covers the configuration of the Summarize feature of the Discourse AI plugin.

:person_raising_hand: Required user level: Administrator

Summarize topics and chat channels for a quick recap. Use it in mega topics and large discussions to figure out what’s happening.

Features

  • Summarize topics from topic map (top and bottom of topic)
  • Summarize Chat channels for a specific duration of time (up to 7 days)
  • Cached summaries for topics where a summary was previously generated
  • Regenerate older summaries
  • View summary date and AI model used

Enabling Summarize

Prerequisites

You must configure at least one Large Language Model (LLM) from a provider.

To get started, you can configure one through the Discourse AI - Large Language Model (LLM) settings page.

Configuration

  1. Go to Admin → Plugins → Discourse AI → Settings and make sure the plugin is enabled (discourse ai enabled)
  2. Set the LLM to be used through ai summarization model
  3. Check ai summarization enabled to enable Summarize
  4. We recommend setting which groups of users can generate and view summaries through ai custom summarization allowed groups
  5. (Optional) Enable private message (PM) summaries for specific user groups through ai pm summarization allowed groups
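For admins who prefer the console, the steps above can also be applied from a rails console inside the app container (a sketch: the group ID below is a placeholder, use your own group IDs):

```ruby
# ./launcher enter app && rails c
SiteSetting.discourse_ai_enabled = true
SiteSetting.ai_summarization_enabled = true
SiteSetting.ai_custom_summarization_allowed_groups = "3" # placeholder: staff group ID
```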

Self-hosters will also need to configure the following:

  • ai_summarization_discourse_service_api_endpoint
  • ai_summarization_discourse_service_api_key
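On a self-hosted instance these can likewise be set from the console (a sketch: the endpoint URL and key below are placeholders for your own service):

```ruby
SiteSetting.ai_summarization_discourse_service_api_endpoint = "https://summarization.example.com"
SiteSetting.ai_summarization_discourse_service_api_key = "YOUR_API_KEY"
```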

Technical FAQ

Does Summarize cache results?

  • Summarize does cache results, and even makes them available to users outside the selected user groups.

Caveats

  • Summarize outputs may not be 100% accurate, so make sure to check any output carefully
  • LLM calls can be expensive. We recommend enabling Summarize for specific user groups to help control costs

Last edited by @Saif 2024-11-04T23:37:34Z

Last checked by @hugh 2024-08-06T05:45:39Z

18 Likes

Question

Summarization

Today summarization was activated on the OpenAI forum. While discussing it with one user, I found out that as moderators we see Summarize this topic with much less content than other users do.

As moderators we get the Summarize this topic option

  • For a forum topic - one topic post followed by one reply post will show the option
  • For a direct message - one topic post will show the option

Tried to find documentation on this and found none. Did I miss something?



Feedback on summarization

OMG! That’s perfect!

We need summarization, and we need it now!

(ref)

The summarization topic noted is just above that reply, here.



Suggestion

Automatically drop a summary reply into topics with many replies and many views, especially discussion related topics.

Obviously there needs to be some settings so that it can be customized as to how often and which topics, which may need AI to select.

Also identify the summary reply(ies) in such a way that the summarization code does not use any summary reply as input, as that could skew the relevance of info for subsequent summaries.


Over the past few days as a moderator I have been hand picking topics and adding Discourse AI summaries as replies. The feedback while limited has been positive.

Examples



Crazy suggestion

Create Discourse news site


Every day I visit many sites to get updates. Some of those sites provide news articles about the latest AI trends. Many of those news articles are really just a series of one-liner facts. One news article I read this morning was pulled from the OpenAI Discourse forum and Discord forums. Using the summarization for that topic, I noticed that what was generated was just as good as or better than the news article.

Now many of us know about Hacker News.

So why not have Discourse news, which pulls news from Discourse sites that agree to have content made available on the Discourse news site. Obviously there would have to be opt-in options for each site, user, and topic, but you never know; it could become a new source of info for The Pile, and if it is also done with the proper approval, it would be a benefit to many.



Feedback

Created a summary for this topic but did not publicly post

The summary was created after this post (number 5)

Summary

User jy.genre reported the code interpreter being offline with an uploaded image. EricGT responded, saying it was live for them and produced code. They also shared images of the Discourse forum and the OpenAI status page showing a maintenance message, which they had not seen before. They also shared updated maintenance messages. Markanthonykoop also reported seeing a maintenance popup and had experienced the service looping previously. Magejosh confirmed that they couldn’t open new code interpreter chats but could upload files to previously opened chats. They noticed more frequent repeating mistakes and instructions being forgotten, but presumed this was due to maintenance. EricGT shared another maintenance update from the OpenAI status page.

Two observations

  • The time or sequence of the events is important and missing from the summary
  • There is a lot of critical information in the images that is not being extracted for the summary; perhaps OCR could be used

The original topic is in the Lounge category on OpenAI Discourse, which most cannot access.
Posted here for Roman and Falco to see, and to also save Sam the work of relaying this; Sam is doing great staying in touch on summarization at OpenAI.

I like the reboot thought.

Also consider similar topics that should be merged into one, e.g.

2 Likes

14 posts were split to a new topic: Add more language support for AI summaries

Is there a way to clear cached summaries from using other LLMs? I ask since one LLM gave me this beauty of a summary, so I have since switched to another, but I am not certain how to remove this old summary from the forum system.

Currently, staff can regenerate summaries after 1 hour. The other way is to use the Ruby console to delete a specific cached summary from the DB.
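A minimal sketch of the caching behavior described above (not the actual Discourse implementation): summaries are cached per topic, and staff can force a regeneration only once the cached copy is at least an hour old.

```ruby
class SummaryCache
  REGEN_WINDOW = 3600 # seconds before staff may force a regeneration

  def initialize(clock: -> { Time.now })
    @clock = clock
    @store = {}
  end

  # Returns the cached summary, or generates one via the block.
  # `force:` only takes effect for staff once the window has elapsed.
  def fetch(topic_id, staff: false, force: false)
    entry = @store[topic_id]
    if entry && !(staff && force && @clock.call - entry[:at] >= REGEN_WINDOW)
      return entry[:summary]
    end
    summary = yield
    @store[topic_id] = { summary: summary, at: @clock.call }
    summary
  end

  # Deleting a cached entry directly, as one might from the Ruby console.
  def delete(topic_id)
    @store.delete(topic_id)
  end
end
```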

2 Likes

By the way, we didn’t find the existing summarization models good enough for Discourse, and have switched to using LLMs for this. If you have a server with enough GPU VRAM, running a Llama2 based LLM will get you great results for ai-summarization and ai-helper. I have updated Discourse AI - Self-Hosted Guide with basic instructions about how to run said LLM.

2 Likes

I’m annoyed when people ask this, but can you give a hint what “enough” is?

Maybe ASUS Dual NVIDIA GeForce RTX 3060 V2 OC Edition 12GB GDDR6 Gaming Graphics Card (PCIe 4.0, 12GB GDDR6 Memory, HDMI 2.1, DisplayPort 1.4a, 2-Slot, Axial-tech Fan Design, 0dB Technology) https://a.co/d/8LSAd8A

1 Like

You should take a look at the edit I linked above, it has just that!

But the longer version of it is that quantization allows you to trade off quality and speed to run on lower specs. If you want to run the best models without quantization, you need around 160GB of VRAM. If you accept worse quality, speed, latency, etc., you can use 5% of that.

We have “good enough” results running the model I use as an example there, which barely runs in a machine with 96GB of VRAM.
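The trade-off above can be illustrated with back-of-the-envelope arithmetic (a sketch: this counts model weights only and ignores activations and KV cache, so real requirements are higher).

```ruby
# Rough VRAM needed for model weights alone:
# parameters (in billions) * bits per parameter / 8 gives gigabytes.
def weights_vram_gb(params_billions, bits_per_param)
  params_billions * bits_per_param / 8.0
end

puts weights_vram_gb(70, 16) # a 70B model at fp16 -> 140.0 GB
puts weights_vram_gb(70, 4)  # the same model 4-bit quantized -> 35.0 GB
```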

3 Likes

I may be missing it so pardon me if I am, but where is the option to force a fresh regeneration for staff?

Sorry for the bump, but I haven’t been able to find the method for staff to regenerate summaries after 1 hour. Does this 1 hour window require that reply has been made to the topic, is there some secret UI combination that has to be performed to access this, etc.?

Is there any way to use the Anthropic's claude-2 summarization strategy, using AWS Bedrock with Anthropic access (AI Chat Bot has this option)?

Currently, either Anthropic model can only be selected if ai_anthropic_api_key is configured.

1 Like

Yes, you need to

  • Set ai_anthropic_api_key to ‘a’ (to bypass the validation)
  • Fill your bedrock credentials
  • Select claude 2 as the summarization model
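The workaround above, expressed as site settings from a rails console (a sketch: the Bedrock credential setting names and values are assumptions, check your admin settings page for the exact names):

```ruby
SiteSetting.ai_anthropic_api_key = "a"           # dummy value to bypass validation
SiteSetting.ai_bedrock_access_key_id = "AKIA..." # assumed setting name, placeholder value
SiteSetting.ai_bedrock_secret_access_key = "..." # assumed setting name, placeholder value
SiteSetting.ai_bedrock_region = "us-east-1"      # assumed setting name
```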

We have a planned overhaul of the way models / inference / APIs can be selected in the UI, but this will work for you in the meanwhile.

4 Likes

Feedback.

For the most part the summary is working fine.

Links to images in the post are not correct.

e.g.

Following a few photographs shared by [Foxabilo](/t/-/475167/13; /t/-/475167/24; /t/-/475167/34; /t/-/475167/37)

This is part of the good ol’ question of languages… but does someone have an idea why summarization sometimes respects the language of the topic and sometimes uses English? It seems to happen totally randomly.

And the actual question could be: is there a system prompt where we can suggest the language to use, or does it come from the summarize tool?

This is hard coded internally atm, but we plan to allow flexibility here.

Some people like longer summaries, others like shorter ones … etc…

4 Likes

This text may need an update, as I believe it supports Gemini also.

Additionally, feedback on the settings UI: was there any particular reason that the summarization_strategy and custom summarization allowed groups settings were moved to the Others page instead of living on the Discourse AI page along with the rest of the AI settings? It took a while to find them, and they are getting lost among a set of unrelated settings.

How does one cycle this feature on/off? I’m not seeing the Summarize button for topics with the required number of posts and also seeing messages in error logs so I want to try to turn it off and then on again.

How can we completely disable Summarize with AI?

The quality is really bad and I just find the whole button pointless - I’d like to remove it.

The same way you enabled it, with the summarization strategy setting: to disable it, reset that setting to its default value.

What model are you using? We found the Claude 3 Opus performs best at this task.