Build our own LLM plugin, or choose another option?

Hi all,

we're hosting a Discourse board within our corporate network. We can't integrate ChatGPT or any other public LLM, but there is a big initiative in our company building an internal service for AI models (including GPT).
They have their own API, though, and I wanted to ask whether it would be possible to write a custom LLM plugin.
Or what other options do we have?

The calls to this internal API have to be authenticated, and the API itself needs a separate API key, so several mandatory pieces of information have to be provided upfront.

Any help would be much appreciated.

Thanks and Greetings,

WS


You can use custom models and/or endpoints with the current plugin.

Then, if you need to add more functionality, you can fork and maintain your own plugin as well.


Yes, but have you installed the AI plugin to see what it supports already?


As long as your internal endpoints expose an OpenAI-style API interface, you can use Discourse AI as is and just define a custom LLM.
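For reference, an OpenAI-compatible endpoint is one that accepts a POST to a `/v1/chat/completions`-style route with a JSON body of messages and returns a completion. A minimal sketch in Python of what such a request looks like (the base URL, key, and model name below are placeholders, not your company's real values):

```python
import json

# Hypothetical values -- substitute your internal service's URL and key.
BASE_URL = "https://llm.internal.example/v1"
API_KEY = "replace-with-your-key"

def build_chat_request(prompt: str, model: str = "internal-gpt"):
    """Build the URL, headers, and JSON body of an OpenAI-style
    chat completion request, without actually sending it."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {API_KEY}",  # standard OpenAI-style auth header
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

url, headers, body = build_chat_request("Hello")
print(url)  # https://llm.internal.example/v1/chat/completions
```

If your internal service accepts a request shaped like this, pointing Discourse AI's custom LLM configuration at it should be enough; if it doesn't, that's where the gap will be.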


I'll try it out. One big concern I'm having, though, is the authentication.
It's not only an API key, but also OIDC-based (corporate policy stuff :slight_smile: )…

Meaning I would need to somehow customize the way the board makes requests against this API. Is that possible without forking and doing it myself?
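One common pattern for this situation (not specific to Discourse) is a small reverse proxy between the board and the internal API: the board is configured with the proxy's address, and the proxy handles the OIDC token exchange and attaches both credentials when forwarding. A rough sketch of the two pieces such a proxy would need, where the token-request shape follows the standard OAuth2 client-credentials grant but the `X-Api-Key` header name is purely hypothetical:

```python
from urllib.parse import urlencode

def build_token_request(token_url: str, client_id: str, client_secret: str):
    """Standard OAuth2 client-credentials token request (RFC 6749, sec. 4.4)
    that a proxy would POST to the corporate OIDC provider."""
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    return token_url, headers, body

def outbound_headers(access_token: str, internal_api_key: str) -> dict:
    """Headers the proxy attaches when forwarding to the internal AI API.
    'X-Api-Key' is a placeholder -- use whatever header your service expects."""
    return {
        "Authorization": f"Bearer {access_token}",
        "X-Api-Key": internal_api_key,
    }
```

With something like this in front of the internal API, the board itself only ever sees one plain endpoint, so no fork would be needed on the Discourse side.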

Thanks folks, you're awesome time and again :+1: