Tikkel
(Digger)
October 14, 2025, 7:43
1
From inside the container I can reach the Ollama service, but the Discourse plugin only returns an "Internal Server Error".
```
curl http://172.17.0.1:11434/v1/chat/completions -H "Content-Type: application/json" -d '{
  "model": "llama3.1:8b",
  "messages": [{"role": "user", "content": "Hallo"}]
}'
{"id":"chatcmpl-145","object":"chat.completion","created":1760470827,"model":"llama3.1:8b","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":"Halló! (That's the Icelandic pronunciation, by the way) or more commonly: Hallo! How can I help you today?"},"finish_reason":"stop"}],"usage":{"prompt_tokens":11,"completion_tokens":29,"total_tokens":40}}
```
Name: Ollama Llama 3.1 8B
Model ID: llama3.1:8b
Provider: OpenAI
URL: http://172.17.0.1:11434/v1/chat/completions
Tokenizer: Llama3
Context window: 32000
Falco
(Falco)
October 14, 2025, 8:12
2
Can you go to the /logs page on your instance and share the error info here, please?

Did you run the cURL test from inside the Discourse container or from the outside? It may be a networking issue where you need to make the Ollama instance reachable by Discourse.
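Reachability from inside the container (`cd /var/discourse && ./launcher enter app`, then run the script there) can be checked with a small probe; a sketch, with the host and port taken from the LLM config above:

```python
import socket

def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# A refusal or timeout here points to container networking;
# a successful connect means the problem sits higher up in the plugin.
print(reachable("172.17.0.1", 11434))
```

This only tests the TCP path, not the HTTP layer, so it complements rather than replaces the cURL test.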
Tikkel
(Digger)
October 15, 2025, 4:42
3
The test was run from inside the Discourse container.
If I test it from the web UI, I get an "Internal Server Error".
Here is the log from that moment:
```
tail -f /var/discourse/shared/standalone/log/rails/production.log
Started GET "/chat/api/me/channels" for 10.233.21.85 at 2025-10-15 04:33:58 +0000
Processing by Chat::Api::CurrentUserChannelsController#index as JSON
Completed 200 OK in 131ms (Views: 0.1ms | ActiveRecord: 0.0ms (0 queries, 0 cached) | GC: 75.4ms)
Started GET "/admin/plugins/discourse-ai/ai-llms/test.json?ai_llm%5Bmax_prompt_tokens%5D=32000&ai_llm%5Bmax_output_tokens%5D=&ai_llm%5Bapi_key%5D=[FILTERED]&ai_llm%5Btokenizer%5D=DiscourseAi%3A%3ATokenizer%3A%3ALlama3Tokenizer&ai_llm%5Burl%5D=http%3A%2F%2F172.17.0.1%3A11434%2Fv1%2Fchat%2Fcompletions&ai_llm%5Bdisplay_name%5D=Ollama%20Llama%203.1%208B&ai_llm%5Bname%5D=llama3.1%3A8b&ai_llm%5Bprovider%5D=open_ai&ai_llm%5Benabled_chat_bot%5D=false&ai_llm%5Bvision_enabled%5D=false&ai_llm%5Binput_cost%5D=&ai_llm%5Boutput_cost%5D=&ai_llm%5Bcached_input_cost%5D=&ai_llm%5Bprovider_params%5D%5Borganization%5D=&ai_llm%5Bprovider_params%5D%5Bdisable_native_tools%5D=true&ai_llm%5Bprovider_params%5D%5Bdisable_temperature%5D=true&ai_llm%5Bprovider_params%5D%5Bdisable_top_p%5D=true&ai_llm%5Bprovider_params%5D%5Bdisable_streaming%5D=true&ai_llm%5Bprovider_params%5D%5Benable_responses_api%5D=true&ai_llm%5Bprovider_params%5D%5Breasoning_effort%5D=default" for 10.233.21.85 at 2025-10-15 04:34:01 +0000
Processing by DiscourseAi::Admin::AiLlmsController#test as JSON
Parameters: {"ai_llm"=>{"max_prompt_tokens"=>"32000", "max_output_tokens"=>"", "api_key"=>"[FILTERED]", "tokenizer"=>"DiscourseAi::Tokenizer::Llama3Tokenizer", "url"=>"http://172.17.0.1:11434/v1/chat/completions", "display_name"=>"Ollama Llama 3.1 8B", "name"=>"llama3.1:8b", "provider"=>"open_ai", "enabled_chat_bot"=>"false", "vision_enabled"=>"false", "input_cost"=>"", "output_cost"=>"", "cached_input_cost"=>"", "provider_params"=>{"organization"=>"", "disable_native_tools"=>"true", "disable_temperature"=>"true", "disable_top_p"=>"true", "disable_streaming"=>"true", "enable_responses_api"=>"true", "reasoning_effort"=>"default"}}}
Completed 500 Internal Server Error in 45ms (ActiveRecord: 0.0ms (0 queries, 0 cached) | GC: 39.0ms)
```
app.yml:

```yaml
templates:
  - "templates/postgres.template.yml"
  - "templates/redis.template.yml"
  - "templates/web.template.yml"
  - "templates/web.ratelimited.template.yml"
  - "templates/web.ssl.custom.template.yml"

expose:
  - "127.0.0.1:8080:80"
  - "0.0.0.0:8443:443"

params:
  db_default_text_search_config: "pg_catalog.english"
  db_shared_buffers: "4096MB"

env:
  DISCOURSE_ALLOWED_INTERNAL_HOSTS: "localhost,127.0.0.1,172.17.0.1"
  http_proxy: "http://proxy.de:8080"
  https_proxy: "http://proxy.de:8080"
  no_proxy: "localhost,127.0.0.1,172.17.0.0/16,.firma.de"
  ENABLE_SSL: true
  DISCOURSE_BASE_URL: "https://forum.firma.de:8443"
  DISCOURSE_HOSTNAME: forum.firma.de
  DISCOURSE_PORT: 8443
  DISCOURSE_CDN_URL: "https://forum.firma.de:8443"
  DISCOURSE_FORCE_HTTPS: true
  LC_ALL: en_US.UTF-8
  LANG: en_US.UTF-8
  LANGUAGE: en_US.UTF-8
  UNICORN_WORKERS: 8
  DISCOURSE_DEVELOPER_EMAILS: 'm.k@firma.de'
  DISCOURSE_SMTP_ADDRESS: 10.176.97.14
  DISCOURSE_SMTP_PORT: 25
  DISCOURSE_SMTP_USER_NAME: ""
  DISCOURSE_SMTP_PASSWORD: ""
  DISCOURSE_SMTP_ENABLE_START_TLS: false
  DISCOURSE_SMTP_DOMAIN: forum.firma.de
  DISCOURSE_NOTIFICATION_EMAIL: noreply@forum.firma.de
  DISCOURSE_SMTP_OPENSSL_VERIFY_MODE: none

volumes:
  - volume:
      host: /var/discourse/shared/standalone
      guest: /shared
  - volume:
      host: /var/discourse/shared/standalone/log/var-log
      guest: /var/log

hooks:
  after_code:
    - exec:
        cd: $home/plugins
        cmd:
          - git clone https://github.com/discourse/docker_manager.git

run:
  - exec: echo "Beginning of custom commands"
  - exec: |
      cd /var/www/discourse
      su discourse -c 'bundle exec rails runner "
        SiteSetting.force_https = true
        SiteSetting.port = 8443
        Rails.cache.clear
      "'
  - exec: echo "End of custom commands"
```
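One thing worth checking, given the `http_proxy`/`no_proxy` entries above: many HTTP clients match `no_proxy` entries only by exact host or domain suffix, so a CIDR entry like `172.17.0.0/16` may not exempt requests to `172.17.0.1`, which would send the plugin's request to `proxy.de:8080` instead of Ollama. Whether the Ruby HTTP stack Discourse uses honors CIDR here is an assumption on my part, not something the logs confirm; Python's stdlib matcher illustrates the suffix-only behaviour:

```python
from urllib.request import proxy_bypass_environment

# The CIDR entry from the app.yml above is not understood by
# suffix-style matching, so the host would NOT bypass the proxy:
print(proxy_bypass_environment(
    "172.17.0.1", {"no": "localhost,127.0.0.1,172.17.0.0/16,.firma.de"}))  # False

# Listing the bridge IP explicitly matches:
print(proxy_bypass_environment(
    "172.17.0.1", {"no": "localhost,127.0.0.1,172.17.0.1,.firma.de"}))     # True
```

If that hypothesis holds, adding `172.17.0.1` itself to `no_proxy` (alongside the CIDR range, which cURL does understand) would be a cheap experiment.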