So, it turns out this is probably a core bug; I understand efforts to resolve it are ongoing.
It's to do with the message responses being unusually long. The workaround is to maximise the chat window, after which the notification flair goes away. This is not so straightforward on mobile, of course.
(thanks to @JammyDodger for bringing this to my attention, that saved me a lot of time)
I'm excited to play with this, especially if it would be possible to have a set of different bots that might be limited by specific prompts.
For a hypothetical example:
A bot called QuestionsOnlyBot (a la Whose Line's Questions Only) that always obeys the prompt, "Always respond to every response with a question in the style of Wayne Brady."
A bot called NewsflashBot (Newsflash) that always obeys the prompt:
Newsflash is a game in which one performer plays a "reporter in the field" standing in front of a green screen onto which a video is chroma-keyed for the audience. Other performers play news anchors interviewing the reporter, who does not know what is going on. The anchors provide clues through their questioning, and when the host thinks it's time, the reporter must guess what is happening. You act as the reporter and the user is the news anchor interviewing you. Make sure the user asks you questions about the scene. And always respond in the style of Colin Mochrie.
Or something along those lines.
Would it be possible to:
Have multiple bots?
Constrain the bot(s) with an admin- or staff-provided prompt?
EDIT: on the constraining part, I tried manually putting a prompt at the beginning of the chat, but as the chat goes on, ChatGPT loses its memory of earlier messages (beyond the chatbot's maximum look-behind, I assume) and forgets the prompt with the initial rules.
Anyone have an idea how to fix the prompt in place so it always remembers it until it's told to stop using that rule?
No plans to implement multiple bots at present but sounds like a fun idea.
"NewsflashBot" sounds particularly challenging, as LLMs are trained on dated data, but you might find a way to prompt it to talk about genuine, up-to-the-minute news.
I can think of ways of achieving that spontaneously, but that's quite a side alley and quite a lot of work. Separate plugin perhaps?
You might be able to achieve this with "prompt engineering", but remember the bot has a "setup" system prompt that is currently stored in a locale string. This is sent with every request. Changing it can make the bot behave very differently, but it cannot be amended on the fly.
You can edit this string and other prompts in Admin → Customize → Text
Changing these can really affect behaviour.
You can try a system prompt like:
"You are an extreme Formula One fan, you love everything to do with motorsport and its high octane levels of excitement"
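For illustration, here's a minimal Ruby sketch of how a "setup" system prompt can be prepended to every request so the persona persists across the conversation. The `build_messages` helper and the message shapes are my own illustration, not the plugin's actual code:

```ruby
# Hedged sketch: prepend a fixed "setup" system prompt to every request.
# build_messages is an illustrative helper, not the plugin's real API.
def build_messages(system_prompt, history)
  [{ role: "system", content: system_prompt }] + history
end

messages = build_messages(
  "You are an extreme Formula One fan, you love everything to do with motorsport.",
  [{ role: "user", content: "What did you think of the race?" }]
)
# messages.first is always the system prompt, so the persona is re-sent
# with every API call even as the user history grows.
```

Because the system prompt is rebuilt into every request rather than relying on the model's memory, it doesn't get "forgotten" the way an in-chat prompt does.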
To change that on the fly we'd need some kind of magic phrase feature, perhaps restricted to staff?
I doubt I'll be able to deliver any of this for free. As you might imagine, creating this plugin was a massive distraction from paid work as it is. PR welcome.
You can already achieve staff exclusivity with the current plugin as is: just don't permission it for other groups, populate the high trust level setting with only the staff group, and turn off the button.
I think I may have figured out how to create something like this with the custom system prompts you suggested.
If I'm able to create a system prompt that uses {variables} that pull from custom user fields, then I can create one bot that can change based on what text the user has in their profile.
For example, the system prompt could be:
You are a helpful assistant. I want you to speak in %{aibot_language} and talk only about the following topic: %{aibot_topic}.
And then the custom user fields could be aibot_language and aibot_topic and people could put in whichever language and topic they want.
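If the plugin interpolates its locale strings the way core Discourse does, the `%{}` placeholders would be filled in with Ruby's format operator. A minimal sketch, assuming that mechanism (the field values below are made-up examples):

```ruby
# Hedged sketch: Ruby %{} interpolation, the same placeholder syntax
# used in Discourse locale strings. Field names aibot_language and
# aibot_topic are the hypothetical custom user fields from this post.
template = "You are a helpful assistant. I want you to speak in " \
           "%{aibot_language} and talk only about the following topic: %{aibot_topic}."

prompt = template % { aibot_language: "French", aibot_topic: "Formula One" }
puts prompt
# The resulting string embeds each user's own field values, so one
# locale template can yield a different persona per user.
```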
But Iām not sure how easy it would be to have these custom user fields show up as variables in the Customize > Text:
Anyway, if that works, it may be a good enough workaround for now, avoiding slash commands and multiple bots.
@merefield I just disabled it - also the chat feature! This way I can control how much the bot is used, if any… (I have a quite small site). But the feature is very neat!
Hi folks! I understand this probably won't be prioritized, but I was trying to test the bot on a multisite setup and I got this:
```text
Note if upgrading: The `::Ruby::OpenAI` module has been removed and all classes have been moved under the top level `::OpenAI` module. To upgrade, change `require 'ruby/openai'` to `require 'openai'` and change all references to `Ruby::OpenAI` to `OpenAI`
```
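For reference, the rename the gem is describing amounts to this (a before/after sketch, not the plugin's actual code):

```diff
- require "ruby/openai"
- client = Ruby::OpenAI::Client.new(access_token: token)
+ require "openai"
+ client = OpenAI::Client.new(access_token: token)
```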
Disabling Discourse Chatbot brings our forum back up. Please tell me if I can test anything or provide more data.