Local Ollama is not working with the plugin

Inside the container, I can use the Ollama service, but the Discourse plugin only returns an "Internal Server Error".

```
curl http://172.17.0.1:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.1:8b",
    "messages": [{"role": "user", "content": "Hallo"}]
  }'
{"id":"chatcmpl-145","object":"chat.completion","created":1760470827,"model":"llama3.1:8b","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":"Halló! (That's the Icelandic pronunciation, by the way) or more commonly: Hallo! How can I help you today?"},"finish_reason":"stop"}],"usage":{"prompt_tokens":11,"completion_tokens":29,"total_tokens":40}}
```

Name: Ollama Llama 3.1 8B
Model ID: llama3.1:8b
Provider: OpenAI
URL: http://172.17.0.1:11434/v1/chat/completions
Tokenizer: Llama3
Context window: 32000
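
As a quick sanity check (a minimal sketch, assuming Ollama's stock OpenAI-compatible API), you can confirm that the configured Model ID matches a model Ollama actually serves:

```
# List the models exposed by Ollama's OpenAI-compatible API;
# one of the "id" values must match the Model ID configured in the plugin.
curl http://172.17.0.1:11434/v1/models
```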

Can you go to the /logs page on your instance and share the error info here, please?
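
If the /logs page is not convenient, the same errors usually also land in the Rails log on disk; a sketch, assuming a standard docker-based install with the default standalone data directory:

```
# Tail the Rails production log from the host
# (path assumes a standard /var/discourse install with the "standalone" data dir)
tail -f /var/discourse/shared/standalone/log/rails/production.log
```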

Did you run the cURL test from inside the Discourse container or from outside it? It may be a networking issue where you need to make the Ollama instance reachable from inside Discourse's container.
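
To rule that out, you can repeat the request from inside the container; a minimal sketch, assuming a standard docker-based install under /var/discourse with the default app container name:

```
# Enter the running Discourse container (standard install layout assumed)
cd /var/discourse
./launcher enter app

# From inside the container, send the same request the plugin would make
curl http://172.17.0.1:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.1:8b", "messages": [{"role": "user", "content": "Hallo"}]}'
```

If this fails inside the container while the same command works on the host, Ollama is probably only listening on 127.0.0.1; starting it with OLLAMA_HOST=0.0.0.0 so it also listens on the Docker bridge address (172.17.0.1) is the usual fix.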