Best practices for creating Discourse AI examples

Greetings! I noticed the Discourse AI plugin now includes an Examples section in the admin panel, where you can simulate previous user/model interactions to guide better responses.

Are there any best practices for creating these? Should they be short and focused, or more like full responses?

Also, is this a good place to offload content we might normally put in the persona/system prompt? And how does the model treat examples differently from what’s in the prompt—are they weighted or processed differently?

Just looking to get more out of this feature without bloating the prompt. Appreciate any tips or clarification!


They should follow the same format you expect your outputs to take. If you want short replies in subsequent interactions, keep the examples short, and vice versa.

No. Examples are simply rounds of conversation that get prepended to every interaction with that persona.

It depends on the model. Some models, like Gemma, treat the system prompt and user messages with basically the same weight, while others give greater weight to the system prompt.

The idea is that your conversations with this persona will look like this:

system: system prompt goes here
user: example question 1
assistant: example answer 1
user: example question 2
assistant: example answer 2
user: your actual message goes here
assistant:

The examples work great for grounding the style of the assistant's messages.
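If it helps to picture how that assembly happens, here's a minimal sketch in Python. The function name `build_messages` and the message dictionaries are purely illustrative, not the plugin's actual code; it just shows the examples being prepended before the real message.

```python
# Illustrative sketch only -- not the Discourse AI plugin's actual code.
# Shows how a persona's configured examples end up prepended to every request.

def build_messages(system_prompt, examples, user_message):
    """Assemble the message list sent to the model.

    `examples` is a list of (user_text, assistant_text) pairs,
    i.e. the example rounds configured on the persona.
    """
    messages = [{"role": "system", "content": system_prompt}]

    # Every configured example round is added, in order,
    # before the actual user message.
    for user_text, assistant_text in examples:
        messages.append({"role": "user", "content": user_text})
        messages.append({"role": "assistant", "content": assistant_text})

    # The real message from the current conversation goes last.
    messages.append({"role": "user", "content": user_message})
    return messages
```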
