Support for prompt customization in DiscourseAI

I have one and only one request, and it is always the same: you have a prompt for this, so please don’t hide it in the voids of the code. All I need is to add one phrase: always use Finnish. That’s it. And suddenly everything would be usable globally.

But now this is just a waste of my time, my active users’ time, and the time of every anonymous user I get, including those who would need such a service.

This is a wonderful piece of work. But because of that one missing feature… no. I would like to offer it to my users, because if they can use it, everyone can use it. But there is absolutely no point in generating these explanations, summaries and so on in English on a Finnish forum.

5 Likes

Hopefully not before the major language issue is solved :thinking:

We have a plan for prompt customization in DiscourseAI, allowing our users to quickly change all the prompts used in the various parts of the plugin. I believe it’s something we will be able to tackle this year.

In the meantime, we can centralize such requests in this topic.

8 Likes

This is such a pickle! Making prompts customizable is such a hard problem.

Thinking through the specifics here, let’s take:

    CompletionPrompt.seed do |cp|
      cp.id = -306
      cp.name = "explain"
      cp.prompt_type = CompletionPrompt.prompt_types[:text]
      cp.messages = { insts: <<~TEXT }
        You are a tutor explaining a term to a student in a specific context.
        I will provide everything you need to know inside <input> tags, which consists of the term I want you
        to explain inside <term> tags, the context of where it was used inside <context> tags, the title of
        the topic where it was used inside <topic> tags, and optionally, the previous post in the conversation
        in <replyTo> tags.
        Using all this information, write a paragraph with a brief explanation
        of what the term means. Format the response using Markdown. Reply only with the explanation and
        nothing more.
      TEXT
    end

How can we solve this and make it customizable?

  1. Use the translation system, move the instructions into server.en.yml
  2. Retire “Completion Prompt” in favor of the AI Persona model
  3. Provide an editor for “Completion Prompt” model
  4. Prompt engineer our way out of it

1. Use the translation system, move the instructions into server.en.yml (rough sketch below)

pros

  • We will ship automatically with support for multiple languages
  • System already exists, nothing new to build
  • Very customizable via localization overrides in admin UI

cons

  • Many LLMs don’t work well in languages other than English; translating the instructions may result in far inferior performance
  • If people translate the tags, it will be completely broken (e.g. <title>)
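
To make (1) concrete, here is a minimal sketch of what the seed above could turn into if the instructions moved to the locale files; the `discourse_ai.prompts.explain` key is invented for illustration, not the plugin’s real key:

    # A sketch of option (1), with made-up key names: the instructions would
    # live in config/locales/server.en.yml, e.g.
    #
    #   en:
    #     discourse_ai:
    #       prompts:
    #         explain: |
    #           You are a tutor explaining a term to a student in a specific context.
    #           ...
    #
    # and the seed only looks the text up, so admin localization overrides
    # (and any community translations) apply automatically.
    CompletionPrompt.seed do |cp|
      cp.id = -306
      cp.name = "explain"
      cp.prompt_type = CompletionPrompt.prompt_types[:text]
      cp.messages = { insts: I18n.t("discourse_ai.prompts.explain") }
    end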

2. Centralize on AI Persona

pros

  • We already have most of the UI
  • A trivial way to add/remove things from the AI helper

cons

  • We will not ship with translations
  • We need to build UI to flag a persona for use by a feature (is this for illustrate post? for the composer helper? for the bot title generator?); some of these mappings are 1-to-1 and some are 1-to-many (one possible shape is sketched below)
  • Some of the personas would not make sense for bot conversations; they require <replyTo>, <item> and <input> tags.
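
To illustrate what that feature flagging could look like, here is one entirely hypothetical shape, a list of feature names stored on the persona plus a lookup helper; none of these column or feature names come from the plugin:

    # Purely hypothetical sketch of flagging personas for features; the
    # `features` column and the feature names are assumptions, not the
    # plugin's current schema.
    class AiPersona < ActiveRecord::Base
      FEATURES = %w[explain proofread summarize illustrate_post bot_title_generator]

      # `features` is assumed to be an array column listing which plugin
      # features this persona may serve (1-to-1 or 1-to-many).
      def serves?(feature)
        features.include?(feature.to_s)
      end

      def self.for_feature(feature)
        all.select { |persona| persona.serves?(feature) }
      end
    end

    # e.g. the composer helper would then pick from AiPersona.for_feature(:proofread)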

3. Provide a UI for Completion Prompts

pros

  • Clear separation from personas, reduced confusion
  • Nice to be able to add new completion prompts for users … if they want something extra below “translate”, no problem
  • Easy to edit

cons

  • Complex to deal with “drift”: if a user overrides a prompt and we later fix the original to add a <something_new> tag, how will they know? (one possible mitigation is sketched below)
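
One possible way to handle that, sketched with made-up column names: store a digest of the default the admin based their override on, and flag the override as stale whenever we ship a newer default:

    # Hypothetical drift check; the custom_insts / default_insts /
    # overridden_from_digest columns are invented for illustration.
    require "digest"

    class CompletionPrompt < ActiveRecord::Base
      # remember which default the admin's override was based on
      def override!(custom_text)
        update!(
          custom_insts: custom_text,
          overridden_from_digest: Digest::SHA256.hexdigest(default_insts),
        )
      end

      # true when we have shipped a newer default (say, adding a
      # <something_new> tag) since the admin last reviewed their override
      def stale_override?
        custom_insts.present? &&
          overridden_from_digest != Digest::SHA256.hexdigest(default_insts)
      end
    end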

4. Prompt engineer our way out of it

If we manage to pull off this magic, with a bit more custom instruction it could be nice, e.g. adding “IMPORTANT: all replies must be in Finnish”, but getting this to work consistently is going to be hard.
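
Roughly, (4) boils down to something like the sketch below, where `with_locale_instruction` is an invented helper and the exact wording of the directive is precisely the part that is hard to get right:

    # Hypothetical helper for option (4): keep the prompt itself in English
    # and append one extra directive based on the forum's default locale.
    def with_locale_instruction(insts)
      locale = SiteSetting.default_locale # e.g. "fi" on a Finnish forum
      # mapping the code to a human-readable name ("Finnish") would likely
      # ground the model better, but the idea is the same
      "#{insts}\nIMPORTANT: all replies must be written in the forum language (locale: #{locale})."
    end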

pros

  • magic, just works

cons

  • in reality, very unlikely to work consistently; having the custom instructions in the proper language will ground the model far better

Personally, I am leaning towards (3) here, especially because it gives us the ability to add custom prompts and has strong separation. But I totally get (2) as a way out; maybe we can pull it off, and it would unify a bunch of code, which is very nice. We need to play with this and experiment with the ideas first to decide on a winner.

6 Likes

Perhaps not. But OpenAI works.

We (as end-user admins) don’t need to engineer the whole prompt in these situations. We only need a request to use a specific language added to the prompt. I don’t know anything about coding, but generally speaking, adding a text string to text is quite trivial, is it not? And you are sending just text here?

And the prompt itself must be in English, of course.

2 Likes

It’s specific to the language issue rather than general prompt customization… but if we had a language dropdown, could we ask the AI to translate the prompt (except for the tags) and then save/send that prompt?

Translating the prompt into Finnish with AI:

:person_curly_hair: translate this into finnish except for the <> tags:

You are a tutor explaining a term to a student in a specific context. I will provide everything you need to know inside <input> tags, which consists of the term I want you to explain inside <term> tags, the context of where it was used inside <context> tags, the title of the topic where it was used inside <topic> tags, and optionally, the previous post in the conversation in <replyTo> tags. Using all this information, write a paragraph with a brief explanation of what the term means. Format the response using Markdown. Reply only with the explanation and nothing more.

:robot: Sinä olet ohjaaja, joka selittää termin oppilaalle tietyssä kontekstissa. Minä tarjoan sinulle kaiken tarvittavan tiedon <input> tageissa, joka sisältää termin, jonka haluat minun selittävän <term> tageissa, kontekstin, jossa sitä käytettiin <context> tageissa, aiheen otsikon, jossa sitä käytettiin <topic> tageissa, ja valinnaisesti, edellisen viestin keskustelussa <replyTo> tageissa. Käyttäen kaikkea tätä tietoa, kirjoita kappale lyhyellä selityksellä siitä, mitä termi tarkoittaa. Muotoile vastaus käyttäen Markdownia. Vastaa vain selityksellä eikä millään muulla.

Could the technical details be a separate part of the prompt? Then we’d have a prompt that’s something like the two lists below (with a rough sketch of how they could be stitched back together after them):

Apply this behavior (admins can edit or append):

  • You are a tutor explaining a term to a student in a specific context.
  • Write a sentence (edited from “paragraph”) with a brief explanation of what the term means.
  • Reply only with the explanation and nothing more.
  • Use lots of emoji

Using these technical details (uneditable, where we update to avoid drift):

  • the term I want you to explain is inside <term> tags
  • the context of where it was used inside <context> tags
  • the title of the topic where it was used inside <topic> tags
  • the previous post in the conversation is in <replyTo> tags (optional)
  • format the response using Markdown
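
Stitching the two halves back together could then be as simple as concatenating the editable behaviour with the locked technical details at request time; a minimal sketch with made-up variable names:

    # Hypothetical sketch: admins edit only `behavior`; `technical` stays
    # under our control, so new tags can be added without drifting away
    # from the override.
    behavior = <<~TEXT
      You are a tutor explaining a term to a student in a specific context.
      Write a sentence with a brief explanation of what the term means.
      Reply only with the explanation and nothing more.
      Use lots of emoji.
    TEXT

    technical = <<~TEXT
      The term I want you to explain is inside <term> tags, the context of where
      it was used is inside <context> tags, the title of the topic is inside
      <topic> tags, and optionally the previous post is in <replyTo> tags.
      Format the response using Markdown.
    TEXT

    insts = "#{behavior}\n#{technical}"
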
1 Like

No, that is a really bad idea. It would do two translations, changing the prompt every time. All that is needed is the phrase ”Answer in [whatever language]”[1]. Your problem is to decide what that language would be, and I’d suggest the same one the forum is using as its default locale.


  1. or ”use”… that is only a matter of phrasing, but it has to be in English. Well, that’s true with OpenAI; the others are uncharted territory for me ↩︎

1 Like

I have a temporary solution.

Fork discourse-ai from GitHub and modify the content to speak French.


Then modify the app.yml file and change the discourse-ai plugin address to point to your own repo.

Finally, manually sync your repo from the official repo.

Sorry, but please don’t: this PR sends requests to a Netlify proxy. I will try some easy fixes today.

2 Likes

I have a proof of concept and working example here:

GPT-4 works quite well in most languages, GPT-3.5 is OK-ish, and I have no idea what Gemini Pro is talking about; it is off in random land.

Claude 2 produces reasonable results as well.

3 Likes

Proofreader

I tested before and after, and at least for GPT-4 Turbo and Portuguese there is no change at all when using the proofreader. I guess the model was already smart enough to keep my text in the original language.

AI Image Caption

I extended it to image captions, and it’s working very well there.

5 Likes

Thanks, guys! At least captions are now created in the wanted language (and the quality is as expected, but that comes from OpenAI, for me anyway).

Summaries… I don’t know yet, because I haven’t been able to create those for a while now. But that is a totally different story.

2 Likes