Discourse AI - Summarization

This topic covers the configuration of the Summarization module of the Discourse AI plugin.


This module lets you use AI-powered summarization strategies for topics and chat channels. discourse-ai registers these strategies with Discourse core's summarization feature, allowing admins to select one via the summarization_strategy site setting.

Available strategies include:

  • A collection of open-source models from HuggingFace: bart-large-cnn-samsum, flan-t5-base-samsum, and long-t5-tglobal-base-16384-book-summary.
  • OpenAI ChatGPT, using either gpt-3.5-turbo or gpt-4.
  • Anthropic’s Claude v1 and v2 models.
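
If you prefer the Rails console over the admin UI, a minimal sketch of selecting a strategy looks like the following. The setting names are the ones mentioned in this topic, but the exact value strings accepted by summarization_strategy vary by plugin version, so treat the values below as placeholders and check the dropdown in your admin settings:

```ruby
# Rails console sketch (./launcher enter app, then `rails c` on a standard install).
# Equivalent to picking a strategy in Admin > Settings.
SiteSetting.ai_openai_api_key = "sk-..."      # placeholder key; required before an OpenAI strategy can be selected
SiteSetting.summarization_strategy = "gpt-4"  # placeholder value -- use a string your version's dropdown accepts
```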

Availability

:discourse: Hosted by us? Currently, this module is available to all customers hosted by Discourse, on any plan. :tada:

If you’re an Enterprise customer, you can contact us to have it added to your site on request. :discourse:

:information_source: Self-hosted users can install the plugin at any time by following Install Plugins in Discourse - sysadmin - Discourse Meta.

Configuration

You must configure each model’s settings before you can select it. The settings page shows a hint telling you which settings you still need to configure.

Permissions

Since sending a request to some providers comes with a cost, you can control which users can trigger summarization through the custom summarization allowed groups setting (defaults to staff and trust level 3).

We’ll cache these results and make them available to users who aren’t in those groups, as well as to anonymous users.
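
As a rough illustration, the same group restriction can be adjusted from the Rails console. The internal setting name (custom_summarization_allowed_groups) and the pipe-separated group-ID storage format are assumptions based on how Discourse group-list settings usually work, so verify both on your install first:

```ruby
# Rails console sketch: allow trust level 2 members to request custom summaries
# in addition to the currently allowed groups. Setting name and storage format are assumptions.
tl2 = Group.find_by(name: "trust_level_2")
current = SiteSetting.custom_summarization_allowed_groups  # e.g. "3|13" (pipe-separated group IDs)
SiteSetting.custom_summarization_allowed_groups = [current, tl2.id].join("|")
```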

Screenshots

(Screenshot: summarizing a topic)


Summary from “The State of JavaScript on Android in 2015 is... poor”

It also lets you do the same for chat channels, allowing you to view a summary of messages sent anywhere from the last hour up to the previous seven days.

(Screenshot: summarizing a chat channel)


Question

Summarization

Today summarization was activated on the OpenAI forum. In discussing it with one user, I found out that moderators see the Summarize This Topic option on topics with much less content than other users do.

As moderators, we get the Summarize This Topic option:

  • For a forum topic - one topic post followed by one reply will show the option
  • For a direct message - one topic post will show the option

I tried to find documentation on this and found none. Did I miss something?



Feedback on summarization

OMG! That’s perfect!

We need summarization, and we need it now!

(ref)

The summarization topic noted is just above that reply, here.



Suggestion

Automatically drop a summary reply into topics with many replies and many views, especially discussion-related topics.

Obviously there need to be some settings so that how often and which topics can be customized; selecting those topics may itself need AI.

Also, identify the summary reply (or replies) in such a way that the summarization code does not use them as input, since that could skew the relevance of information in subsequent summaries.


Over the past few days, as a moderator, I have been hand-picking topics and adding Discourse AI summaries as replies. The feedback, while limited, has been positive.

Examples



Crazy suggestion

Create a Discourse news site


Every day I visit many sites to get updates. Some of those sites provide news articles about the latest AI trends. Many of those news articles are really just strings of one-liner facts. One news article I read this morning was pulled from the OpenAI Discourse forum and Discord servers. Using summarization on that topic, I noticed that what was generated was just as good as or better than the news article.

Now many of us know about Hacker News.

So why not have a Discourse news site that pulls news from Discourse sites that agree to have their content made available there? Obviously there would have to be opt-in options for each site, user, and topic, but you never know; it could become a new source of info for The Pile, and if it were also done with the proper approvals, it would be a benefit to many.



Feedback

Created a summary for this topic but did not post it publicly.

The summary was created after this post (number 5)

Summary

User jy.genre reported the code interpreter being offline with an uploaded image. EricGT responded, saying it was live for them and produced code. They also shared images of the Discourse forum and the OpenAI status page showing a maintenance message, which they had not seen before. They also shared updated maintenance messages. Markanthonykoop also reported seeing a maintenance popup and had experienced the service looping previously. Magejosh confirmed that they couldn’t open new code interpreter chats but could upload files to previously opened chats. They noticed more frequent repeating mistakes and instructions being forgotten, but presumed this was due to maintenance. EricGT shared another maintenance update from the OpenAI status page.

Two observations

  • The time or sequence of the events is important and missing from the summary
  • There is a lot of critical information in the images that is not being extracted for the summary; perhaps OCR could be used.

The original topic is in the Lounge category on the OpenAI Discourse, which most cannot access.
Posted here for Roman and Falco to see, and also to save Sam the work of relaying this; Sam is doing great staying in touch on summarization at OpenAI.

I like the reboot thought.

Also consider similar topics that should be merged into one, e.g


14 posts were split to a new topic: Add more language support for AI summaries

Is there a way to clear cached summaries generated by other LLMs? I ask since one LLM gave me this beauty of a summary, so I have since switched to another, but I am not certain how to remove the old summary from the forum.

Currently, staff can regenerate summaries after 1 hour. The other way is to use the Ruby console to delete a specific cached summary from the DB.
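
For reference, a minimal console sketch of that second option. The model and column names below are assumptions; depending on your discourse-ai version the cached summaries may live in a table such as ai_summaries or summary_sections, so inspect the schema before deleting anything:

```ruby
# Rails console sketch for clearing a stale cached summary so the next request regenerates it.
# AiSummary is an assumed model name -- substitute whichever model your plugin version defines.
topic = Topic.find(1234)                               # hypothetical topic id
AiSummary.where(target: topic).pluck(:id, :created_at) # inspect what is cached first
AiSummary.where(target: topic).destroy_all             # then delete the cached summary
```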


By the way, we didn’t find the existing summarization models good enough for Discourse, and have switched to using LLMs for this. If you have a server with enough GPU VRAM, running a Llama2-based LLM will get you great results for ai-summarization and ai-helper. I have updated Discourse AI - Self-Hosted Guide with basic instructions on how to run said LLM.


I’m annoyed when people ask this, but can you give a hint as to what “enough” is?

Maybe ASUS Dual NVIDIA GeForce RTX 3060 V2 OC Edition 12GB GDDR6 Gaming Graphics Card (PCIe 4.0, 12GB GDDR6 Memory, HDMI 2.1, DisplayPort 1.4a, 2-Slot, Axial-tech Fan Design, 0dB Technology) https://a.co/d/8LSAd8A


You should take a look at the edit I linked above; it has just that!

But the longer version is that quantization allows you to trade off quality and speed in exchange for running on lower specs. If you want to run the best models without quantization, you need around 160 GB of VRAM. If you accept worse quality, speed, latency, etc., you can use as little as 5% of that (roughly 8 GB).

We have “good enough” results running the model I use as an example there, which barely runs on a machine with 96 GB of VRAM.


I may be missing it, so pardon me if I am, but where is the option for staff to force a fresh regeneration?

Sorry for the bump, but I haven’t been able to find the method for staff to regenerate summaries after 1 hour. Does this 1-hour window require that a reply has been made to the topic, is there some secret UI combination that has to be performed to access it, etc.?

Is there any way to use Anthropic's claude-2 summarization strategy via AWS Bedrock with Anthropic access (the AI Chat Bot has this option)?

Currently, either Anthropic model can only be selected if ai_anthropic_api_key is configured.


Yes, you need to:

  • Set ai_anthropic_api_key to ‘a’ (to bypass the validation)
  • Fill in your Bedrock credentials
  • Select Claude 2 as the summarization model

We have a planned overhaul of the way models / inference / APIs can be selected in the UI, but this will work for you in the meantime.
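
For the console-inclined, those three steps roughly translate to the sketch below. The ai_bedrock_* setting names and the strategy value string are assumptions; the admin settings UI shows the exact names your plugin version uses:

```ruby
# Rails console sketch of the Bedrock workaround described above.
SiteSetting.ai_anthropic_api_key = "a"                 # dummy value, only to bypass validation
SiteSetting.ai_bedrock_access_key_id = "AKIA..."       # assumed setting name -- your AWS access key
SiteSetting.ai_bedrock_secret_access_key = "..."       # assumed setting name -- your AWS secret key
SiteSetting.ai_bedrock_region = "us-east-1"            # assumed setting name -- the Bedrock region
SiteSetting.summarization_strategy = "claude-2"        # placeholder value -- pick Claude 2 in the admin dropdown if unsure
```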


Feedback.

For the most part the summary is working fine.

For links to images in the post, the links produced in the summary are not correct.

e.g.

Following a few photographs shared by [Foxabilo](/t/-/475167/13; /t/-/475167/24; /t/-/475167/34; /t/-/475167/37)

This is part of the good ol’ question of languages… but does anyone have an idea why summarization sometimes respects the language of the topic and sometimes uses English? It looks like it happens totally randomly.

And the actual question could be: is there a system prompt where we can suggest the language to use, or does it come from the summarize tool?

This is hard-coded internally at the moment, but we plan to allow flexibility here.

Some people like longer summaries, others like shorter ones … etc…


This text may need an update, as I believe it supports Gemini also.

Additionally, some feedback on the settings UI: was there any particular reason that the summarization_strategy and custom summarization allowed groups settings were moved to the Others page instead of staying on the Discourse AI page along with the rest of the AI settings? It took a while to find them, and they are getting lost among a set of unrelated settings.

How does one cycle this feature on/off? I’m not seeing the Summarize button on topics with the required number of posts, and I’m also seeing messages in the error logs, so I want to try turning it off and then on again.

How can we completely disable Summarize with AI?

The quality is really bad and I just find the whole button pointless - I’d like to remove it.

The same way you enabled it: with the summarization strategy setting. To disable it, reset that setting to its default value.
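
A minimal console sketch of that reset, assuming a standard install; resetting the setting from the admin UI is equivalent:

```ruby
# Rails console sketch: remove the override so summarization_strategy falls back
# to its default (no strategy selected), which hides the Summarize button.
SiteSetting.remove_override!(:summarization_strategy)
```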

What model are you using? We found that Claude 3 Opus performs best at this task.