|Summary|Uses a remote AI language model to prepare and post a summary of a topic.|
|Repository Link|GitHub - merefield/discourse-ai-topic-summary: Uses a remote AI language model to prepare and post a summary of a Topic|
|Install Guide|How to install plugins in Discourse|
(NB, this is just a UI preview using some horrible random dev fixtures, results are vastly better on real data)
Experimental, but stable.
After a minimum number of posts, if a topic lives in an in-scope category, the plugin sends the topic text, together with a prompt, to an OpenAI large language model and posts the response, which is intended to be a summary of the contents; it succeeds at this pretty well. There is a choice of model.
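Conceptually, the request amounts to pairing the configurable prompt with the topic's post text. The sketch below is illustrative only (the method and parameter names are assumptions, not the plugin's actual code), assuming the OpenAI chat-completions request shape:

```ruby
require "json"

# Build a chat-completion request body for summarising a topic.
# `posts` is an array of post body strings; `prompt` is the configurable
# instruction (a plugin setting in the real plugin). Illustrative names only.
def build_summary_payload(posts, prompt:, model: "gpt-3.5-turbo")
  topic_text = posts.join("\n\n")
  {
    model: model,
    messages: [
      { role: "system", content: prompt },
      { role: "user", content: topic_text }
    ]
  }
end

payload = build_summary_payload(
  ["First post of the topic.", "A reply with more detail."],
  prompt: "Summarise the following forum topic concisely."
)
puts JSON.pretty_generate(payload)
```

The response text would then be posted back into the topic as the summary.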
The summaries are often surprisingly good, occasionally sublime. However, when one is not good enough, I’ve added a community downvoting mechanic that forces a refresh of a poor summary once a set threshold is reached.
Summaries are in any case re-sought when a set number of additional posts have been made.
You can modify the prompt and the thresholds.
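Put together, the refresh decision boils down to checking two counters against their thresholds. A minimal sketch (the method name and default values here are illustrative stand-ins for the plugin's actual settings):

```ruby
# Decide whether a topic's summary should be regenerated.
# Illustrative sketch: the real plugin drives these thresholds
# from configurable site settings.
def summary_stale?(downvotes:, posts_since_summary:,
                   downvote_threshold: 3, post_threshold: 5)
  downvotes >= downvote_threshold ||
    posts_since_summary >= post_threshold
end

# A summary with enough downvotes, or enough new posts behind it,
# gets re-sought on the next check.
puts summary_stale?(downvotes: 3, posts_since_summary: 1)
```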
IMPORTANT NOTE: the summaries are never exposed to anonymous users, so the text will not be crawlable (you may or may not think this is a good thing, but at least the crawlers will only operate on your genuine human data).
EXPERIMENTAL FEATURE: auto-tagging support:
Tired of tagging Topics? Let the AI do it for you! (Relatively intelligently!)
Sometimes the AI gets too creative despite the direction we give it here, so you can restrict it to the set of existing tags.
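That restriction amounts to filtering the model's suggestions against the site's existing tag list. A minimal sketch (the method name is hypothetical, not the plugin's API):

```ruby
# Keep only AI-suggested tags that already exist on the site.
# Comparison is case-insensitive, so "Support" matches "support".
def restrict_to_existing_tags(suggested, existing)
  existing_lookup = existing.map(&:downcase)
  suggested.select { |tag| existing_lookup.include?(tag.downcase) }
end

puts restrict_to_existing_tags(["Support", "banana"], ["support", "feature"]).inspect
```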
You can quickly create a non-admin user for this purpose from the rails console by using:

rake admin:create

(Don’t give this user admin privileges.) This is a bit of a hacky workaround: the current Discourse “internal API” for tagging does not allow you to specify “no new tags”, so the only easy way of preventing this at present is to act as a user who does not have that privilege (i.e. a trust level below the minimum required to create tags).
Due to token limits, it’s currently only good for about 40 posts, so material in posts beyond that point will not be included. That will almost certainly change in the future as the models get more powerful and the services more sophisticated.
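A rough way to see why: with a heuristic of roughly four characters per token and a few-thousand-token context window, only the first few dozen posts fit. A sketch of budget-based truncation (the divisor and budget are rough assumptions, not the plugin's exact accounting):

```ruby
# Very rough token estimate: ~4 characters per token for English text.
def approx_tokens(text)
  (text.length / 4.0).ceil
end

# Take posts from the start of the topic until the token budget is spent;
# anything after that point is simply not sent to the model.
def posts_within_budget(posts, budget_tokens: 3000)
  total = 0
  posts.take_while do |post|
    total += approx_tokens(post)
    total <= budget_tokens
  end
end

posts = Array.new(10) { "x" * 400 } # each ~100 tokens
puts posts_within_budget(posts, budget_tokens: 250).length
```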
Rate limits and costs
Retrieving data from OpenAI is not free. However, rate limiting is implicitly tied to posting: you have control over how many posts it takes before a new summary is sought. This is unlikely to be an issue.
Please take a look through the settings; they should be pretty self-explanatory. You will need a token from https://openai.com. The link is also in the settings.
There’s now a Layouts-compatible widget which ships with the plugin. You just have to install the Layouts plugin and configure it. Once installed, you can turn off the standard top-of-topic summary and rely on the widget in the sidebar.
Disclaimer: I’m not responsible for what the LLM responds with. Please understand the pros and cons of an LLM and what they are and aren’t capable of. They are very good at creating convincing, context-aware text, but can be factually wrong.
Copyright: Open AI made a statement about Copyright here: Will OpenAI claim copyright over what outputs I generate with the API? | OpenAI Help Center
- Add front & back tests
- Add more user configuration to affect the style of response
- Add GPT-4 support when available
- Make the model setting a drop-down list
- Add widget support for the Layouts plugin