Discourse Chatbot 🤖 (Now smarter than ChatGPT!*)

@merefield - Fantastic work! Just amazing :slight_smile: Thank you for this. In your opinion, would there be any practical use for me to install this on SWAPD? :smiley: And does it affect performance?

2 Likes

Hey David,

Treat it as highly experimental. It isn’t integrated into specific workflows out of the box, but you might find it amusing and may find unexpected applications after testing it out.

The key thing is context. It reads the last x posts or messages and decides how to respond based on that and your final prompt. You can design a system prompt to make it behave “in character”.

Not especially; there’s one lightweight job run for each response.

2 Likes

We are talking about a conversation with three people (including AIBot). The bot wouldn’t send anything until the others say hello; that would prevent the repeated messages without limiting the expected interaction.

Everyone could still send 1-on-1 messages to AIBot, and that isn’t supposed to be affected by this adjustment.

I’m asking because I’m not the only one using the forum, and this would prevent a lot of unexpected behavior.

It’s simple for us to just wait for the others to write something before mentioning the bot, but not everyone will do that :grimacing:

1 Like

Sorry, I’m still confused by your proposal:

  • If you mention the bot alone in a Topic or Message Channel, the bot will chat
  • As soon as a second person has posted a message or Post, the bot will no longer respond unless targeted.

I deliberately designed it this way to cater for this exact issue.

Is this not a good enough solution?

1 Like
  1. If you talk alone with the bot, the messages are repeated and nonsensical. It seems to be related to OpenAI; not a big problem, because we can limit 1-on-1 chat and that’s all.

  2. If you add someone to a three-person group chat with AIBot, the same thing happens if someone mentions the bot first.

That breaks the conversation and could be avoided. So having AIBot wait for two humans before sending a reply could solve that specific issue as a workaround.

Nobody wants to talk to the bot first in chat right now, but that could be limited to avoid unexpected behavior until the first issue is solved.

1 Like

So why talk to the bot at all until the second person arrives and begins to chat? Isn’t that the solution?

And in any case the bot would stop replying as soon as the second person has posted a message.

I’m not sure I understand this

You can’t “add” someone to personal chat presently in Discourse?

In main channels, e.g. General, the bot will not speak unless targeted. I’ve just checked, and indeed it must be targeted from the word go, as this is classed as not a “Direct Message” type channel.

Test case in point:

Bot didn’t reply as it was not targeted.

Once targeted, it replies:

Also the bot is working fine here 1 to 1, so I’m not seeing your issue with 1 to 1 conversations with the bot in shared areas either?

We probably should take this conversation offline in private chat if you wish to elaborate further.

1 Like

Ah @matenauta I see the problem.

The issue is not the intended rules, it’s a bug.

It’s not working as intended on Direct Messages with two human participants. It is not currently following the rules I set out above. I will take a look.

3 Likes

OK that’s done.

The existing logic was correct, but the user count attribute of a channel is updated by a background job, and that job is not guaranteed to have run recently enough for this decision, so it is better to calculate the count from first principles, which I’ve now done.

Actual messages were not necessary for the bot to shut up, only the channel’s user membership count, but that count was incorrect at the time of checking. Now fixed. Apologies for the confusion.
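For anyone curious about the failure mode described here: this is not the actual plugin code, just a minimal self-contained sketch of the general pattern. A count cached by a periodic background job lags behind reality until the job runs, while counting directly from the membership records is always current. All class and method names below are illustrative.

```ruby
# Sketch: stale job-updated cache vs. counting from first principles.
class Channel
  attr_reader :memberships

  def initialize
    @memberships = []
    @cached_user_count = 0 # only refreshed when the background job runs
  end

  def add_member(user)
    @memberships << user
    # NOTE: the cache is deliberately NOT updated here; a periodic job does that.
  end

  # Simulates the background job that refreshes the cached attribute.
  def refresh_cache!
    @cached_user_count = @memberships.size
  end

  def cached_user_count
    @cached_user_count
  end

  # Computed directly from the membership records: always correct.
  def live_user_count
    memberships.size
  end
end

channel = Channel.new
channel.add_member(:alice)
channel.add_member(:bob)

channel.cached_user_count # => 0 (the job hasn't run yet, so the cache is stale)
channel.live_user_count   # => 2 (correct immediately)
```

A decision like “stay quiet once a second human joins” made against `cached_user_count` can be wrong in exactly the window between a member joining and the job running, which matches the bug described above.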

5 Likes

There might also be the option to run an LLM locally on your server but you’d need a very expensive powerful server well above the minimum spec required to run the actual Discourse, turning your install basically into an LLM service with a Discourse tacked on the side ;). That’s almost certainly going to cost more.

I’m afraid the only practical solution at present is paying for the API use once any free trial period has ended. Much like email.

2 Likes

Hi, I’m grateful for this great plugin, my friend. How can I customize the ChatGPT widget button?

image

2 Likes

CSS; Customize → Text (search for “chatbot”); and the plugin API to swap the icon.

3 Likes

Thank you for the great tip. :ok_hand:

2 Likes

Hello. OpenAI answers get interrupted. Is there a setting for this?

1 Like

This came up before. You can try to ask the bot to finish its message or increase the token limit:

chatbot_max_response_tokens

I agree that the bot not working within this limit appears to be an unhelpful limitation, but I don’t have control over that.

3 Likes

I have this error:

OpenAIBot: There was a problem: This model’s maximum context length is 4097 tokens. However, you requested 4224 tokens (2224 in the messages, 2000 in the completion). Please reduce the length of the messages or completion.

But not sure how to set length for GPT.
Do you have any instruction for this?

1 Like

Can you just read up one message? You’ll find your answer there.

3 Likes

You have too many characters in the Posts you are sending. Reducing the look-back setting (chatbot_max_look_behind) might help. Ultimately, your Posts are getting a bit long!

4 Likes

My chatbot_max_look_behind is 5
What should I set?

1 Like

Actually, you need to reduce chatbot_max_response_tokens.

(“However, you requested 4224 tokens (2224 in the messages, 2000 in the completion). Please reduce the length of the messages or completion.”)
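The arithmetic behind that error is worth spelling out: the model’s context window must hold both the prompt (“messages”) and the completion, so the safe response-token ceiling is whatever is left after the prompt. The numbers here come straight from the error message above; nothing else is assumed.

```ruby
# Worked example using the figures from the error message.
context_limit  = 4097 # the model's maximum context length
prompt_tokens  = 2224 # "in the messages" (posts sent as context)
max_completion = context_limit - prompt_tokens
# max_completion == 1873, so chatbot_max_response_tokens must be at most
# 1873 *for this prompt size*. A setting of 2000 overflows, and even a
# lower setting can still overflow if the look-behind posts grow longer.
```

Note that the prompt side grows with the look-back window, so either setting can be the one to reduce depending on where the tokens are going.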

1 Like

My chatbot_max_response_tokens is 2000
And it still errors when I set it to 1500.

What should I set with that error?

1 Like