Options for using an alternative AI provider?

@Falco @sam are there any options if we want to use our own AI provider outside of what are available in the settings? e.g. if we’re hosting our own, if we’re using a service like mendable.ai, etc.

We support both the vLLM and TGI APIs for LLM inference, and any API that mimics OpenAI's should work as well: just point the `ai openai gpt4 url` site setting at it.
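To illustrate the idea (a minimal sketch, not the plugin's actual code): OpenAI-compatible servers such as vLLM all accept the same chat-completions request shape, so switching providers is just a matter of changing the base URL the setting points at. The helper name and the local URL below are made up for illustration.

```python
import json

def build_chat_request(base_url: str, model: str, messages: list) -> tuple:
    """Return the endpoint URL and JSON body for an OpenAI-style chat
    completion call. Only base_url varies between providers -- the
    payload itself is identical whether you target api.openai.com,
    a self-hosted vLLM server, or any other compatible shim."""
    url = base_url.rstrip("/") + "/v1/chat/completions"
    body = json.dumps({"model": model, "messages": messages})
    return url, body

# Same helper, hypothetical local vLLM server instead of OpenAI:
url, body = build_chat_request(
    "http://localhost:8000",
    "my-model",
    [{"role": "user", "content": "hi"}],
)
```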


It looks like it’s not quite 1:1 for OpenAI, but it’s very close:

Reading through the API, implementing a dialect/API endpoint for Discourse AI would be a few days of work, especially since we would need to call newConversation and store state.

Totally doable from what I can tell, but it is custom work. We can do it if you wish to sponsor.
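To sketch why that custom work is needed (purely hypothetical; none of these method names come from mendable's real API): the provider is stateful, so a conversation has to be created first and its id stored, while the plugin's existing dialects assume stateless chat-completion calls. A dialect would have to bridge that gap by caching conversation ids, roughly like this:

```python
class StatefulDialect:
    """Hypothetical adapter mapping stateless completion calls onto a
    stateful provider that requires an explicit conversation object."""

    def __init__(self, client):
        self.client = client        # assumed provider client interface
        self.conversations = {}     # topic id -> provider conversation id

    def complete(self, topic_id, message):
        # Create the provider-side conversation lazily on first use,
        # then reuse the stored id for follow-up messages.
        conv_id = self.conversations.get(topic_id)
        if conv_id is None:
            conv_id = self.client.new_conversation()
            self.conversations[topic_id] = conv_id
        return self.client.send_message(conv_id, message)

# Stand-in client so the sketch runs without a real provider:
class FakeClient:
    def __init__(self):
        self.created = 0

    def new_conversation(self):
        self.created += 1
        return self.created

    def send_message(self, conv_id, message):
        return f"conv {conv_id} got: {message}"

dialect = StatefulDialect(FakeClient())
first = dialect.complete(topic_id=42, message="hello")
second = dialect.complete(topic_id=42, message="again")
```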

Personally, I am not sure these custom models will win long term over frontier models; sadly for mendable, they may be just one GPT-5 release away from becoming pretty obsolete.

Even stuff like GPT-4 / Claude Opus today, when honed carefully, could probably outperform this.

@Roman_Rizzi is working on allowing you to inject extra external context via files into a persona.

We can already do pretty amazing things with a custom persona today; you could easily adapt the GitHub helper to be SailPoint-specific, for example: