Discourse Chatbot 🤖 (Now smarter than ChatGPT!*)

Short update: the AI replies in topics/posts should be better. I ran some tests and think something is still not quite right, but at least it’s acting with knowledge from the topic.

If you want to help test this, start with new topics (in a private category, to begin with), mention the AI, and provide clear information; numbers are simple to test with. Create a chain of at most 10 posts containing at least 2–3 numbers (default setting = 5 posts of history), then ask if it recalls all the numbers mentioned earlier. In my test it completely “forgot” one number, even though it had replied to it earlier.
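To see why a number from early in the chain gets “forgotten”, it helps to picture the history as a fixed-size window. This is a hypothetical sketch, not the plugin’s actual code; the function name and default are assumptions matching the 5-post default setting:

```ruby
# Only the newest max_posts entries are ever sent to the model;
# anything older than the window is invisible to it.
def history_window(posts, max_posts = 5)
  posts.last(max_posts)
end

posts = (1..10).map { |n| "post #{n}: the number is #{n * 11}" }
context = history_window(posts)
# context holds only posts 6..10, so numbers mentioned in posts 1..5
# simply never reach the model.
```

So a “forgotten” number posted more than 5 posts back is expected behaviour under this windowing, not necessarily a bug.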

Current known behavior: once AIBot has been mentioned in a topic, it keeps replying until a third user joins the conversation.

There are still some bugs with chat - which we will hopefully sort out tomorrow. Want to thank @merefield for the time he puts in. Good night for now.


OK I’ve concluded things are stable now and the obvious big bugs are resolved.

Chat support is finally working properly, and Post responses now use the correct history when there’s been an explicit reply to a Post, so they make much more sense!

Thanks to @jimmynewtron and especially @MarcP for their collaboration and critical feedback.


AIBot seems to be broken in one of my topics. Not a concern for me, but I thought I’d report it in case it’s useful. I’m pressing the topic reply button, and it’s only me and AIBot in the discussion.


OpenAIBot: There was a problem: undefined method `user_id' for nil:NilClass


/var/www/discourse/plugins/discourse-chatbot/app/jobs/regular/chatbot_reply_job.rb:56:in `rescue in execute'
/var/www/discourse/plugins/discourse-chatbot/app/jobs/regular/chatbot_reply_job.rb:51:in `execute'
/var/www/discourse/app/jobs/base.rb:249:in `block (2 levels) in perform'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/rails_multisite-4.0.1/lib/rails_multisite/connection_management.rb:80:in `with_connection'
/var/www/discourse/app/jobs/base.rb:236:in `block in perform'
/var/www/discourse/app/jobs/base.rb:232:in `each'
/var/www/discourse/app/jobs/base.rb:232:in `perform'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/sidekiq-6.5.8/lib/sidekiq/processor.rb:202:in `execute_job'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/sidekiq-6.5.8/lib/sidekiq/processor.rb:170:in `block (2 levels) in process'
/var/www/discourse/vendor/bundle/ruby/3.2.0/gems/sidekiq-6.5.8/lib/sidekiq/middleware/chain.rb:177:in `block in invoke'

Might be an issue from a post created by a prior version. Can you delete all bot posts on that topic and rerun the conversation?
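For anyone curious what this class of error means: `undefined method 'user_id' for nil:NilClass` is Ruby’s way of saying a lookup returned `nil` (here, plausibly a deleted or missing post) before `.user_id` was called on it. A minimal illustration with hypothetical names, not the plugin’s real API:

```ruby
# Stand-in for a post record; only the shape matters here.
Post = Struct.new(:id, :user_id)

def author_of(post)
  # post may be nil, e.g. if the replied-to post was deleted.
  # The safe-navigation operator returns nil instead of raising
  # NoMethodError on nil.
  post&.user_id
end

author_of(Post.new(1, 42)) # => 42
author_of(nil)             # => nil, no crash
```

Guarding the lookup like this (or skipping deleted posts before the call) is the usual fix for this failure mode.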


That fixed it, yes, thanks. I just deleted the bot messages. However, when I then asked it to summarise the thread, it didn’t have anything to summarise. Does the history that gets sent to OpenAI count the deleted messages when selecting the number of messages to include?


The resolved issue might still have an impact, as the bot will include Posts linked by virtue of a reply, regardless of their deleted status.
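Whether deleted posts consume history slots depends on when the deletion filter is applied. A minimal sketch, with illustrative names and a `deleted` flag that are assumptions rather than the plugin’s code, of filtering before the window is taken so deleted posts don’t crowd out live ones:

```ruby
HistoryPost = Struct.new(:raw, :deleted)

# Drop deleted posts first, then take the newest max_posts survivors,
# so a run of deleted bot replies can't empty the effective history.
def live_history(posts, max_posts = 5)
  posts.reject(&:deleted).last(max_posts)
end

posts = [
  HistoryPost.new("kept 1", false),
  HistoryPost.new("deleted bot reply", true),
  HistoryPost.new("kept 2", false)
]
live_history(posts).map(&:raw) # => ["kept 1", "kept 2"]
```

If the filter instead ran after the window was taken, deleted posts would still count against the limit, which would match the “nothing to summarise” symptom above.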

The best thing would be to test it on a fresh Topic tbh.

Also consider my other plugin, which is aimed precisely at the job of summarising a Topic, using the model best suited to that task.


I am somehow able to reproduce something related in a newer topic that had a conversation only with the AIBot.

OpenAIBot: There was a problem: undefined method `reply_to_post_number' for nil:NilClass

Repro: create N posts (where N is the number of posts of history you allow the bot to retrieve), then delete those N posts and ask the bot for a reply.


The bot is not getting context correctly.


Works ok for me:



I have another issue: OpenAIBot: There was a problem: undefined method `user_id' for nil:NilClass


Hi, curious: does this bot ingest the content of the Discourse instance?


No, token limits preclude that and we are not training a model here. We are just prompting the LLM with the last x Posts within the same Topic.
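A rough sketch of what “prompting the LLM with the last x Posts” can look like in the OpenAI chat-message format. The field names (`raw`, `from_bot`) and the function are assumptions for illustration, not the plugin’s actual implementation:

```ruby
# Map the most recent x posts into OpenAI-style chat messages:
# bot posts become "assistant" turns, everything else "user" turns.
def build_messages(posts, x = 5)
  posts.last(x).map do |p|
    { role: p[:from_bot] ? "assistant" : "user", content: p[:raw] }
  end
end

posts = [
  { raw: "What is 2 + 2?", from_bot: false },
  { raw: "4",              from_bot: true  }
]
build_messages(posts)
# first message gets role "user", the bot's reply gets role "assistant"
```

This also makes the cost model clear: each request only pays for those x posts plus the prompt, not for the whole forum.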


It seems to work better in chat as well, but not as well on Topics/Posts.


Do I have to pay to use the bot on my forum?


This is covered in the OP. Yes.


Any chance there could be some settings for this quick chat button? At the moment it appears to always take the user to a personal chat with the bot. I’m not sure I want to encourage lots of hidden personal communication with the bot; users can go to ChatGPT themselves for that. It would be great to have a setting to either direct the user to a group chat channel or create a new topic in a Category specifically set up for AI Discussions.

I’ve tried setting up a category where users can raise topics and effectively ask the bot questions. Could we perhaps flag categories that behave this way, where the bot will:

  • Always respond to the first post of a topic in that category
  • Continue to respond to posts if there are no other users in the discussion
  • Once other users enter the discussion, the bot stops unless specifically replied to or mentioned in a post.
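The three rules above could be condensed into a single predicate. This is only a sketch of the suggestion, with hypothetical argument names; it is not anything the plugin implements today:

```ruby
# Decide whether the bot should reply, per the suggested category rules.
def bot_should_reply?(ai_category:, first_post:, other_users_present:, mentioned_or_replied_to:)
  return mentioned_or_replied_to unless ai_category
  return true if first_post                # always answer the opening post
  return true unless other_users_present   # keep going while it's a 1:1 thread
  mentioned_or_replied_to                  # otherwise only on an explicit ping
end
```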

All these are only suggestions / ideas. I really like this plugin!

Oh, BTW: when trying to enable the summary plugin you mentioned, my rebuild fails and I have to remove the plugin to get the site up again. (I’ll post this separately on that thread with details in a bit.)


Yes you can disable it.

Don’t forget about the quota system.

PR welcome for anything more sophisticated.

Yes, please do, I’m not having the same issue. I just rebuilt a site with it and it went fine.


First community PR merged. Thanks @MarcP! :raised_hands:


so sadddddddddddd =(


Well, look on the bright side, you didn’t have to hire me to write the adaptor plugin! I definitely charge more than OpenAI will charge you :wink:

Perhaps you can get your community to help pay for it?

I have a suspicion it won’t be that expensive to run if you keep the user quotas low:

This is what I’ve spent so far building two plugins on OpenAI, well within the free trial quota:

Plus, the ChatGPT endpoint model is a tenth of the cost of text-davinci-003.