Hey guys,
This is about a self-hosted instance.
I've activated the Discourse AI plugin and ran into a few problems there.
I'm working in an enterprise environment, and AI access is through an internal endpoint with additional authentication.
I implemented an AWS API Gateway endpoint with a Lambda that enriches the headers with the required auth info and passes the request on to the OpenAI-compatible internal endpoint.
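For context, the Lambda does roughly the following (a minimal sketch, assuming a Lambda proxy integration; `UPSTREAM_URL` and `INTERNAL_TOKEN` are placeholders for the internal endpoint and credentials):

```python
import urllib.request

# Placeholders: in the real setup these come from environment variables / Secrets Manager
UPSTREAM_URL = "https://internal-llm.example.com/v1/chat/completions"
INTERNAL_TOKEN = "secret-token"

def enrich(body: str) -> urllib.request.Request:
    """Build the upstream request with the required internal auth header added."""
    return urllib.request.Request(
        UPSTREAM_URL,
        data=body.encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # the header enrichment the internal endpoint requires
            "Authorization": f"Bearer {INTERNAL_TOKEN}",
        },
        method="POST",
    )

def handler(event, context):
    # With an API Gateway proxy integration, the body arrives as a JSON string
    req = enrich(event.get("body") or "{}")
    with urllib.request.urlopen(req) as resp:
        return {
            "statusCode": resp.status,
            "headers": {"Content-Type": "application/json"},
            "body": resp.read().decode("utf-8"),
        }
```

So from Discourse's point of view the gateway looks like a plain OpenAI-compatible endpoint, and the auth handling is invisible to it.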
The LLM test succeeds, so I thought I was good to go.
I also see the related calls in my CloudWatch Logs for the API Gateway call.
Then I linked the summarization personas (Summarization and … (short form)) to this LLM and activated Topic Summarization.
What I'm seeing now, though, is not what I wanted to see, and it's not changing.
The strangest thing is that I don't see any request in the CloudWatch Logs for the summarization attempt.
OK, once more, here is what I did:
- Activated AI
- Created an LLM config (vLLM) with my API Gateway endpoint
- Created the two personas (see above) and linked them to my LLM config
- Activated summarization and selected a persona there (tried both)
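To isolate whether the gateway itself is fine, I can hit it directly with an OpenAI-style request, outside of Discourse. A sketch of such a probe (the URL and model id are assumptions; substitute whatever your LLM config uses):

```python
import json
import urllib.request

# Placeholder: replace with the API Gateway stage URL from the LLM config
GATEWAY_URL = "https://example.execute-api.eu-central-1.amazonaws.com/prod/v1/chat/completions"

def build_probe(url: str) -> urllib.request.Request:
    """Build a minimal OpenAI-style chat completion request."""
    payload = {
        "model": "internal-model",  # assumption: whatever model id the endpoint expects
        "messages": [{"role": "user", "content": "Summarize: Hello world."}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # If this call shows up in CloudWatch but Discourse's summarization attempt
    # does not, the problem is on the Discourse side (persona/LLM wiring),
    # not in the gateway or the Lambda.
    with urllib.request.urlopen(build_probe(GATEWAY_URL)) as resp:
        print(resp.status, resp.read()[:200])
```

Since the LLM test passes and this kind of direct call also logs in CloudWatch, the missing log entries point at the summarization feature never issuing the request in the first place.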
Not sure what I can try next.
Thanks guys and Greetings,
JP