Getting Discourse AI to work with Ollama locally

I am also trying to get this working with Ollama. It seems that the API format the Discourse AI plugin uses is not compatible with Ollama, and no matter which settings I change, the API request it sends stays the same.

This is what works with Ollama:

curl http://192.168.1.2:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?"
}'
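As an aside, recent Ollama versions also expose an OpenAI-compatible endpoint at /v1/chat/completions, which might be a way in if the plugin can be pointed at an OpenAI-style API (the host and model below are just carried over from my example above):

curl http://192.168.1.2:11434/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}]
  }'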

The request below, on the other hand, seems to be the only type of request the plugin is attempting:

curl http://192.168.1.2:11434/ \
  -X POST \
  -H 'Content-Type: application/json' \
  -d '{
    "inputs": "<s>[INST] What is your favourite condiment? [/INST] Well, Im quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever Im cooking up in the kitchen!</s> [INST] Do you have mayonnaise recipes? [/INST]",
    "parameters": {"max_new_tokens": 500, "temperature": 0.5, "top_p": 0.9}
  }'

The AI chat bot supposedly supports Ollama, but I am not able to get a response out of it either.
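One thing worth ruling out: if Discourse is running in the standard Docker setup, localhost inside the container will not reach an Ollama instance on the host, so the endpoint has to be reachable from the container itself. A quick sanity check (the container name app is the discourse_docker default; adjust if yours differs):

# /api/tags just lists the models Ollama has installed, so a JSON
# response here confirms the endpoint is reachable from Discourse.
docker exec -it app curl http://192.168.1.2:11434/api/tags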

So yeah, if anyone has gotten this working with Ollama, please post the settings you used!
