I'm working in an enterprise environment, and we're using Discourse as the discussion board for supporting a cloud platform.
We want to use the Discourse AI plugin for several use cases, and we even have internal AI endpoints that are OpenAI-compatible.
The catch is that outgoing requests to these endpoints have to include an authentication header with an OAuth2 token from an internal machine-to-machine (m2m) auth endpoint, which has to be retrieved upfront.
I have thought of several approaches, such as a local proxy on the EC2 instance hosting Discourse, which could enrich each outgoing request with that auth information.
Another approach is an API gateway with an authorizer Lambda that fetches the token.
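For illustration, the local-proxy approach could look something like the sketch below: a tiny HTTP proxy that fetches and caches the m2m token, then forwards each request to the OpenAI-compatible endpoint with the `Authorization` header added. This is a minimal stand-in, not a production proxy — the URLs, the client-credentials grant shape, and the cache skew are all assumptions to adapt to your internal services.

```python
# Sketch: local token-injecting proxy for an OpenAI-compatible endpoint.
# AUTH_URL and UPSTREAM_URL are placeholders (assumptions) -- point them
# at your internal m2m auth service and AI endpoint.
import json
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

AUTH_URL = "https://auth.internal.example/oauth2/token"          # hypothetical
UPSTREAM_URL = "https://ai.internal.example/v1/chat/completions"  # hypothetical


class TokenCache:
    """Fetches an OAuth2 m2m token and caches it until shortly before expiry."""

    def __init__(self, fetch=None, skew=60):
        self._fetch = fetch or self._fetch_from_auth_server
        self._skew = skew          # refresh this many seconds before expiry
        self._token = None
        self._expires_at = 0.0

    def _fetch_from_auth_server(self):
        # client_credentials grant; response shape assumed to be standard
        # OAuth2 JSON with access_token / expires_in
        body = b"grant_type=client_credentials"
        req = urllib.request.Request(AUTH_URL, data=body, method="POST")
        with urllib.request.urlopen(req) as resp:
            payload = json.load(resp)
        return payload["access_token"], payload.get("expires_in", 300)

    def get(self):
        if self._token is None or time.time() >= self._expires_at:
            token, expires_in = self._fetch()
            self._token = token
            self._expires_at = time.time() + expires_in - self._skew
        return self._token


TOKENS = TokenCache()


class AuthProxy(BaseHTTPRequestHandler):
    """Forwards requests upstream with the Authorization header injected."""

    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        req = urllib.request.Request(UPSTREAM_URL, data=body, method="POST")
        req.add_header("Content-Type",
                       self.headers.get("Content-Type", "application/json"))
        req.add_header("Authorization", f"Bearer {TOKENS.get()}")
        with urllib.request.urlopen(req) as upstream:
            self.send_response(upstream.status)
            self.send_header("Content-Type",
                             upstream.headers.get("Content-Type",
                                                  "application/json"))
            self.end_headers()
            self.wfile.write(upstream.read())


# To run, point the Discourse AI endpoint URL at http://127.0.0.1:8808 and:
#   HTTPServer(("127.0.0.1", 8808), AuthProxy).serve_forever()
```

The Discourse side then needs no credentials at all; it talks plain HTTP to localhost, and the proxy handles token refresh transparently.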
What I haven't understood so far are the tools you can add within the Discourse AI plugin itself.
Could those be used to achieve what I have in mind?
Many thanks for your support and have a great day!
We generally do not like to add too many knobs because it confuses people, but I hear you that this is hard to solve right now; we may need another knob.
One option would be to allow the OpenAI-compatible endpoint configuration to have a "custom headers" section.
Tools could not easily solve this, because that would create an incredibly complex workflow, and we don't have the ability to easily pass the tool all the information it needs.
I guess if custom tools came with enough richness they could accomplish this… it does feel like a bit of a Rube Goldberg machine, but imagine:

IF a configuration with a persona:

- forces tool calls, and
- has a custom tool forced, and that tool has NO params,

THEN we invoke no LLM and simply pass control to the tool, and
THEN we give the tool enough infrastructure to stream results back to the app via inversion of control in some fashion.
It's a pretty staggering amount of change and would end up being an absolute bear to maintain.
I guess an alternative is for you to define a new custom plugin that depends on Discourse AI and defines your own Dialect and Endpoint - that is certainly the simplest plugin-side way to go about this.
That said, it is sooooo much easier to solve this specific need via a lightweight proxy, like Nginx with Lua scripting, that I think @Wurzelseppi will be better served going that route.