Discourse Chatbot 🤖 (Now smarter than ChatGPT!*)

This is covered in the OP. Yes.

3 Likes

Any chance there could be some settings for this quick chat button? At the moment it appears to always take the user to a personal chat with the bot. I’m not sure I want to encourage lots of hidden personal communication with the bot; users can go to ChatGPT themselves for that. It would be great to have a setting to either direct the user to a group chat channel or to create a new topic in a Category specifically set up for AI discussions.

I’ve tried setting up a category where users can raise topics and effectively ask the bot questions. Could we perhaps flag categories that behave this way, so that the bot will:

  • Always respond to the first post of a topic in that category
  • Continue to respond to posts if there are no other users in the discussion
  • Once other users enter the discussion, the bot stops responding unless it is specifically replied to or mentioned in a post.
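A policy like the three bullets above could be sketched as a tiny decision function. This is purely illustrative; the helper and its parameters are hypothetical, not part of the plugin:

```python
# Sketch of the suggested reply policy for a flagged "AI" category.
# All names here are made up for illustration.

def bot_should_reply(is_first_post: bool,
                     other_humans_in_topic: bool,
                     bot_mentioned_or_replied: bool) -> bool:
    if is_first_post:
        return True                  # always answer the opening post
    if not other_humans_in_topic:
        return True                  # keep the one-on-one conversation going
    return bot_mentioned_or_replied  # otherwise only reply when addressed
```

For example, once a second human joins the topic, the bot would stay silent unless mentioned.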

All these are only suggestions / ideas. I really like this plugin!

Oh, BTW: when trying to enable the summary plugin you mentioned, my rebuild fails and I have to remove the plugin to get the site up again. (I’ll post details separately on that thread in a bit.)

2 Likes

Yes you can disable it.

Don’t forget about the quota system.

PR welcome for anything more sophisticated.

Yes, please do, I’m not having the same issue. I just rebuilt a site with it and it went fine.

2 Likes

First community PR merged. Thanks @MarcP! :raised_hands:

3 Likes

so sadddddddddddd =(

1 Like

Well, look on the bright side, you didn’t have to hire me to write the adaptor plugin! I definitely charge more than OpenAI will charge you :wink:

Perhaps you can get your community to help pay for it?

I have a suspicion it won’t be that expensive to run if you keep the user quotas low:

This is what I’ve spent so far building two plugins on OpenAI, well within the free trial quota:

Plus, the ChatGPT endpoint model is a tenth of the cost of text-davinci-003.

2 Likes

Simple math, based on default chatbot settings and gpt-3.5-turbo model.

This means that if you have 100 “average” high-TL users who all happen to use their maximum chatbot requests every week, it would still cost you less than $20 a month.

EDIT: Oops, I forgot to account for input tokens.
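For anyone wanting to redo the math with their own settings, here is a back-of-the-envelope sketch. Every number below is an assumption for illustration (quota, tokens per request, and the gpt-3.5-turbo price at the time of writing), not the plugin’s actual defaults:

```python
# Back-of-the-envelope cost estimate for gpt-3.5-turbo usage.
# All numbers are illustrative assumptions; check your own quota
# settings and current OpenAI pricing before relying on this.

USERS = 100                  # high-trust-level users at their weekly cap
REQUESTS_PER_WEEK = 20       # assumed per-user weekly quota
TOKENS_PER_REQUEST = 1200    # assumed prompt + completion tokens combined
WEEKS_PER_MONTH = 4
PRICE_PER_1K_TOKENS = 0.002  # gpt-3.5-turbo price at the time of writing

monthly_tokens = USERS * REQUESTS_PER_WEEK * WEEKS_PER_MONTH * TOKENS_PER_REQUEST
monthly_cost = monthly_tokens / 1000 * PRICE_PER_1K_TOKENS

print(f"~{monthly_tokens:,} tokens/month -> ${monthly_cost:.2f}/month")
```

With these assumed values the estimate comes out just under $20/month, consistent with the claim above.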

2 Likes

The dollar is worth six times more than my country’s currency, so it is too expensive for me.

2 Likes

Is there a forum where I can see how this GPT chat works? Just out of curiosity; I found it so revolutionary that I wanted to test it.

Just try ChatGPT:

https://chat.openai.com/chat

Or the Playground if you want to fine-tune. Keep in mind the Playground doesn’t have gpt-3.5-turbo (yet?).

Pick up an API token; it’s free for ~3 months with $18 of credit. Then you can test out the plugin too.

1 Like

I think it’s best to keep the code as simple as possible, to allow better enhancements in the future without too many complications. The current behavior in topics is already like you describe, except that it won’t reply to the first post automatically. However, you can easily achieve this by adding a topic template to your ask-AIBot category that mentions @AIBot (and you can even add a default prompt to fine-tune the responses!)

2 Likes

I wonder if it would help to give the LLM a bit more context - perhaps Category description, or the content of a sticky post?

2 Likes

Yeah, I’ve already thought about the Title being important. We could ship that with every request too.

Good suggestion, I’ll roadmap it.

2 Likes

Also, it would be better if it had context of the usernames; it seems to think all the previous messages are mine (in a group chat with multiple participants).

2 Likes

Yeah, I have implemented that in AI Topic Summary (and it can work really well), but not here. Another candidate for a switch, maybe.

The distinction is made by user/assistant roles, but I’m not sure of the effect of adding in usernames, or what format to use for that here … this needs experimentation, I suspect.

Yes, it will always be “the bot said this and the user said this”. The AI is designed for user-bot interactions, not for large groups (even though it can work with those). That said, it’s theoretically possible to feed the {user} input with multiple names and messages … but would it give better responses?

The AI is only as good as the input it gets. Simply adding some usernames won’t always improve the responses; it can also generate confused ones. And in general, the username attached to content isn’t relevant, because the output is based on facts, not opinions. So I have some doubts about this.

@codergautam Why don’t you experiment in ChatGPT by giving it the same input (the content of N posts, with random usernames added above each post) to see how it responds in different scenarios? Also, I’m curious what responses you are getting that led to your request.

2 Likes

Just a thought: add the user’s bio from their profile as part of the context. Even just “MarcP joined in October 2019 and has read 11 hours recently”

That is, put a cast of characters into the preamble.
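As a rough illustration of what such a “cast of characters” preamble could look like (the field names and output format here are made up for the example, not anything the plugin does):

```python
# Sketch: build a "cast of characters" preamble from user profiles.
# Hypothetical fields; a real version would pull these from Discourse.

def cast_preamble(participants):
    """participants: list of dicts with username/joined/read_hours keys."""
    lines = ["Participants in this conversation:"]
    for p in participants:
        lines.append(
            f"- {p['username']} joined in {p['joined']} and has read "
            f"{p['read_hours']} hours recently"
        )
    return "\n".join(lines)
```

Note the cost caveat raised below still applies: every line of preamble is tokens billed on every request.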

1 Like

I hope your wallet is deep, Ed. Every time you add data you are spending money. :) There would also be a token-count challenge with the present limits.

4 Likes

Well I asked the AI to summarize what’s going on in this chat, and it thought that all the conversation was from me and it proceeded to give me a lecture about how inappropriate I am for doing this.

1 Like