Does Discourse AI support third-party relay/proxy APIs (e.g., NewAPI)? Getting “Internal Server Error”

Hi everyone,
I have a question about Discourse AI and whether it supports third-party relay/proxy APIs.

I tried using a relay API from NewAPI (a third-party OpenAI-compatible proxy), but Discourse AI returns an “Internal Server Error.” I’m not sure if this is a configuration issue on my side or a limitation of Discourse AI.

So I’m wondering:

  1. Does Discourse AI currently support OpenAI-compatible third-party / relay / proxy APIs?

  2. Or does it only support official OpenAI and Google APIs right now?

  3. If third-party APIs are supported, is there any special setup required (headers, base URL format, model naming, etc.)?

Using my own official API keys gets expensive, so I’m hoping to use a cheaper relay option if possible.
Also, I’d like to connect Google BananaPro for image generation just for fun — not sure if that’s supported either.

Any pointers or docs would be appreciated. Thanks!

We have customers doing thousands of AI calls daily through OpenAI-compatible proxies, so we know it works. The main issue with providers that say they are “OpenAI compatible” is how compatible they really are.

vLLM, Google, Ollama, LMStudio all provide OpenAI compatible APIs that we test and use daily.

If a specific provider is failing, it’s usually easy to find out why via the /logs page. Can you share the error from there?


Thank you for your reply. I will test a few different third-party APIs tomorrow.

Discourse AI supports almost any third-party API I know of, such as OpenRouter and NewAPI.

Just configure the LLM settings:

Enter your API base URL, API key, and model name.
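Before wiring a relay into Discourse, it can help to check that the relay really speaks the OpenAI chat-completions shape. The sketch below (base URL, key, and model name are all placeholders, not real values) builds the same kind of minimal request an OpenAI-compatible client sends, using only the Python standard library; uncomment the last lines to actually send it once you’ve filled in your relay’s details.

```python
import json
from urllib import request

# All values below are placeholders — substitute your relay's actual details.
BASE_URL = "https://your-relay.example.com/v1"  # must expose /chat/completions
API_KEY = "sk-your-key"
MODEL = "gpt-4o-mini"  # use the exact model name your relay expects

# Minimal OpenAI-style chat-completions payload.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "ping"}],
}

req = request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

print("POST", req.full_url)

# Uncomment to actually send the request against your relay:
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If this request fails outside Discourse (wrong path, auth header format, or model name), the same misconfiguration will show up as an error inside Discourse too.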


Thanks for the reply, brother.

image

But this reply is very strange.

Discourse requires the system prompt field to be filled in. I didn’t want to use a system prompt, so I only entered a minimal one.

When a user @-mentions the bot, it should just answer the question directly.

I haven’t figured out how to configure it yet.

When @-ing the bot, for some reason the default prompt seems to be the “Custom Prompt” one (the format of its reply matches only that “Custom Prompt”, as shown in the screenshot below). Changing it has no effect, and I don’t know why.

image

You can create a new persona before setting the AI helper custom prompt persona.

Here is an example of my new translator persona (you can configure the prompt and other options).

Then choose your new custom persona as the AI helper custom prompt persona.