Discourse Chatbot šŸ¤– (supporting ChatGPT)

So, it turns out this is probably a core bug. :beetle: I understand efforts to resolve are ongoing.

It’s to do with the message responses being unusually long. The solution is to maximise the chat window, and :tada: the notification flair goes away. This is not so straightforward a workaround on mobile, of course.

(thanks to @JammyDodger for bringing this to my attention, that saved me a lot of time)

4 Likes

What should we be looking at now if no-coders like me want to contribute by testing locally?

I will share all my thoughts, but maybe you are focused on a specific thing that we can stress-test or try.

The project seems very solid now and I have my API keys online (using sms4sats to keep my data at home) :ok_hand::+1:

2 Likes

Nothing specific; all but one known issue should be fixed (pending core action). Just have a play and enjoy :+1:

3 Likes

I’m excited to play with this, especially if it were possible to have a set of different bots, each limited by a specific prompt.

For a hypothetical example:

  1. A bot called QuestionsOnlyBot (a la Whose Line’s Questions Only) that always obeys the prompt, ā€œAlways respond to every response with a question in the style of Wayne Brady.ā€

  2. A bot called NewsflashBot (Newsflash) that always obeys the prompt:

    Newsflash is a game in which one performer plays a ā€œreporter in the fieldā€ standing in front of a green screen onto which a video is chroma-keyed for the audience. Other performers play news anchors interviewing the reporter, though they do not know what is going on. The anchors provide clues through their questioning, and when the host thinks it’s time, the reporter must guess what is happening. You act as the reporter and the user is the news anchor interviewing you. Make sure the user asks you questions about the scene. And always respond in the style of Colin Mochrie.

Or something along those lines.

Would it be possible to:

  1. Have multiple bots?
  2. Constrain the bot(s) with an admin- or staff-provided prompt?

EDIT: On the constraining part, I tried manually putting a prompt at the beginning of the chat, but as the chat goes on and ChatGPT loses its memory of what was said before (beyond the chatbot max look-behind, I assume), it forgets the prompt with the initial rules.

Anyone have an idea how to fix the prompt so it always remembers it until it’s told to stop using that rule?

4 Likes

It might be just me - but where do I sign up for that token access?

I see a contact info page that looks like it’s from 2001, with a Contact sales link.

But no place to register…

Are we back to email correspondence and phone calls? Very AI… :slight_smile:

And if I have to use the contact page - do I choose API or ChatGPT?

3 Likes

Yeah their website isn’t very easy to navigate imho.

Try this more direct address: https://platform.openai.com/

You want an API token.

I might even substitute this as the suggested link.

4 Likes

@jimkleiber thanks for the ideas.

No plans to implement multiple bots at present but sounds like a fun idea.

ā€œNewsflashbotā€ sounds particularly challenging as LLMs are trained on dated data but you might find a way to prompt it to talk about genuine up to the minute news.

I can think of ways of achieving that spontaneously but that’s quite a side alley and quite a lot of work. Separate plugin perhaps?

You might be able to achieve this by ā€œprompt engineeringā€ but remember the bot has a ā€œsetupā€ system prompt that is stored in a locale string at present. This is sent every time. Changing this can make the bot behave very differently but it cannot be amended on the fly.

You can edit this string and other prompts in Admin → Customize → Text

Changing these can really affect behaviour.

You can try a system prompt like:

’You are an extreme Formula One fan, you love everything to do with motorsport and its high octane levels of excitement’
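
For context, that ā€œsetupā€ prompt is simply sent as the first message of every request to OpenAI. Roughly like this (a simplified sketch, not the plugin’s exact code; the setting name, locale key and model below are illustrative only):

```ruby
# Simplified sketch (not the plugin's exact code): the "setup" system prompt
# is read from a locale string and sent as the first message of every request.
# The setting name, locale key and model below are illustrative only.
client = OpenAI::Client.new(access_token: SiteSetting.chatbot_open_ai_token)

messages = [
  { role: "system", content: I18n.t("chatbot.prompts.system.basic") },
  { role: "user",   content: "Who has won the most F1 world championships?" }
]

response = client.chat(
  parameters: { model: "gpt-3.5-turbo", messages: messages }
)
puts response.dig("choices", 0, "message", "content")
```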

To change that on the fly, we’d perhaps need some kind of magic-phrase feature, and restrict that to staff?

I doubt I’ll be able to deliver any of this for free. As you might imagine, creating this plugin was a massive distraction from paid work as it is. PR welcome.

You can already achieve staff exclusivity with the current plugin as is. Just don’t permission it for other groups: populate the high trust level setting with the staff group only, and turn off the button.

2 Likes

I may have figured out how to create something like this with the custom system prompts that you suggested.

If I’m able to create a system prompt that uses {variables} that pull from custom user fields, then I can create one bot that can change based on what text the user has in their profile.

For example, the system prompt could be:

You are a helpful assistant. I want you to speak in %{aibot_language} and talk only about the following topic: %{aibot_topic}.

And then the custom user fields could be aibot_language and aibot_topic and people could put in whichever language and topic they want.

But I’m not sure how easy it would be to have these custom user fields show up as variables in Customize > Text.
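
In my head, the plugin would need to do something roughly like this when it builds the prompt (a purely hypothetical sketch on my part; the method name, locale key, and the way the fields are looked up are all made up):

```ruby
# Purely hypothetical sketch: interpolate custom user fields into the system
# prompt. The method name and locale key are invented, and Discourse may store
# admin-created user fields under different keys in practice.
def system_prompt_for(user)
  I18n.t(
    "chatbot.prompts.system.custom",
    aibot_language: user.custom_fields["aibot_language"].presence || "English",
    aibot_topic: user.custom_fields["aibot_topic"].presence || "anything"
  )
end
```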

Anyways, if that works, then it may be a good enough workaround for now, avoiding slash commands and multiple bots :slight_smile:

Thank you for your help!

2 Likes

It’s a very cool feature, this bot :+1:

I have an issue with the location of the floating icon.

Can you reproduce that, @merefield?

iPhone + Google Chrome

2 Likes

It’s deliberately moved up on mobile to avoid clashing with the Chat UI.

The Discourse team got to that location first :slight_smile:

Turn it off if it’s annoying :wink: Or customize it to your taste.

PR welcome if you can improve it.

4 Likes

@merefield I just disabled it - also the chat feature! This way I can control how much the bot is used, if at all… (I have quite a small site). But the feature is very neat!

3 Likes

Thanks, enjoy and have fun customising it!

2 Likes

What about a badge? You did mention @botname

2 Likes

When AIBot mentions someone, it doesn’t use @.

Shouldn’t that be fixed? @merefield

1 Like

Hi folks! I understand this probably won’t be prioritized, but I was trying to test the bot on a multisite setup and I get this:

Note if upgrading: The `::Ruby::OpenAI` module has been removed and all classes have been moved under the top level `::OpenAI` module. To upgrade, change `require 'ruby/openai'` to `require 'openai'` and change all references to `Ruby::OpenAI` to `OpenAI`
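
If I read that note right, the fix would be on the plugin side, something like this (a sketch based only on the message above, not the plugin’s actual code):

```ruby
# Sketch of what the gem's upgrade note describes (not the plugin's actual code).

# Before (older ruby-openai):
require "ruby/openai"
Ruby::OpenAI.configure { |c| c.access_token = ENV["OPENAI_API_KEY"] }

# After (ruby-openai 3.x and later):
require "openai"
OpenAI.configure { |c| c.access_token = ENV["OPENAI_API_KEY"] }
```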

Disabling Discourse Chatbot brings our forum back up again. So please tell me if I can test further or provide more related data :slight_smile:

2 Likes

That’s not really a ā€˜bug’ to be fixed. You’ve kind of just made a Feature request.

The response is coming from OpenAI. It is up to their model to determine how to phrase a response.

We could scan the response to work out what should be converted.

I’m a little reluctant to start adding ā€œmodificationsā€ to what is sent back, but this is a good suggestion.

PR welcome

2 Likes

Ahh, I understand! It makes sense… AIBot still has something to learn… :slight_smile:

4 Likes

It does.

But you make a valid, good suggestion. I’ve added it to the roadmap (but this doesn’t guarantee I will get around to it soon).

The only tricky part is that if we convert them and any of your usernames are common words, we might end up @-mentioning users incorrectly.

Can you imagine if someone called themselves ā€œandā€?

Perhaps we could restrict our search to only those users in the current Topic or Chat Channel?

This might still fail, but would fail less often and would be faster.

We could hide this behaviour behind a setting, for those who aren’t so confident because their site’s usernames look too much like common words.
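
Roughly what I have in mind (a rough sketch, not implemented in the plugin; it assumes we only consider users who have posted in the current Topic):

```ruby
# Rough sketch, not implemented in the plugin: turn bare usernames in the
# bot's reply into @ mentions, restricted to users who have posted in the
# current Topic.
def mentionify(reply, topic)
  usernames = topic.posts.map { |p| p.user&.username }.compact.uniq
  return reply if usernames.empty?

  pattern = /(?<!@)\b(#{usernames.map { |u| Regexp.escape(u) }.join("|")})\b/i
  reply.gsub(pattern) { |name| "@#{name}" }
end
```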

Thoughts?

2 Likes

Also added this to TODO.

2 Likes

Thanks … I’ll take a look at some point but can’t promise an ETA. Added to Known Issues.

4 Likes