Hello. I’ve already read Local Ollama is not working with the Plugin and Getting discourse ai to work with ollama locally, and I have the following env variable in my app.yml:
DISCOURSE_ALLOWED_INTERNAL_HOSTS: "localhost|127.0.0.1|172.17.0.1"
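For reference, that variable sits under the env section of /var/discourse/containers/app.yml, roughly like this (other entries omitted):

env:
  ## let the container reach the Docker bridge IP where Ollama listens
  DISCOURSE_ALLOWED_INTERNAL_HOSTS: "localhost|127.0.0.1|172.17.0.1"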
I can confirm it with the following command:
> sudo docker exec -it app sh -lc 'env | grep INTERNAL'
DISCOURSE_ALLOWED_INTERNAL_HOSTS=localhost|127.0.0.1|172.17.0.1
And I can get a response from the LLM:
> sudo docker exec -it app sh -lc 'curl -s http://172.17.0.1:11434/v1/chat/completions \
-H "Content-Type: application/json" \
-d '\''{"model":"ingu627/exaone4.0:1.2b","messages":[{"role":"user","content":"test"}],"max_tokens":100}'\'''
{"id":"chatcmpl-658","object":"chat.completion","created":1766870997,"model":"ingu627/exaone4.0:1.2b","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":"\u003cinput\u003e\nOK\n\u003c/input\u003e\n\u003coutput\u003e\nI am still in the process of learning. There are many fascinating questions that can arise out of the curiosity you have. What would you like to explore with your AI?\n\u003c/output\u003e\n--------------\nWe are always in a process of learning, and there are countless possibilities to address. Which project you might choose to bring this curiosity into actions?\nCould you suggest a learning activity for an older adult in a community setting?"},"finish_reason":"length"}],"usage":{"prompt_tokens":21,"completion_tokens":100,"total_tokens":121}}
This is the configuration I’m using in the Discourse AI LLM settings, but I can’t get it to work.
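In case it matters, the fields I have set on the LLM admin page are roughly the following (URL and model match the curl above; the tokenizer is whatever the form defaulted to, and the dialect named in the error below suggests the OpenAI/ChatGpt path):

Provider:  OpenAI-compatible
URL:       http://172.17.0.1:11434/v1/chat/completions
Model:     ingu627/exaone4.0:1.2b
Tokenizer: (left at the default)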
The test fails with an Internal Server Error, and I can see this error at /logs:
NameError (undefined local variable or method `tokenizer' for an instance of DiscourseAi::Completions::Dialects::ChatGpt)
app/controllers/application_controller.rb:440:in `block in with_resolved_locale'
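From the backtrace it looks like the ChatGpt dialect is failing to resolve a tokenizer for the model. This is how I'd check what tokenizer my LLM row actually has stored (a sketch, assuming the plugin keeps LLM configs in an LlmModel record with a tokenizer column, which may vary by plugin version):

> sudo docker exec -it app sh -lc 'cd /var/www/discourse && rails runner "puts LlmModel.pluck(:display_name, :tokenizer)"'

If that column turns out to be empty, or the plugin is out of date relative to core, I suppose updating and rebuilding (cd /var/discourse && sudo ./launcher rebuild app) would be the next thing to try.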
What else do I need to do to get this working? Thank you.
