I don’t have an LLM tab on my device; I only have these options on the main AI settings page
How was your instance installed? Are you running latest of Discourse? Can you try in safe mode without any customizations?
I tried safe mode, but nothing worked. The software was installed on a Raspberry Pi 5 running Raspberry Pi OS Lite with Docker. I am currently using version 3.6.0.beta3-latest.
You are going to need to upgrade to latest as a minimum. Discourse v2025.11.0-latest is what you should be on.
Thanks, I will do that now.
OK, how do I go about upgrading to the latest?
If it is a standard Docker-based install, `./launcher rebuild app` will do the trick.
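For a standard install, the usual sequence looks like this (a sketch assuming the default install location of `/var/discourse`; adjust the path if yours differs):

```shell
# Standard upgrade for a Docker-based Discourse install.
# Assumes the default install directory /var/discourse.
cd /var/discourse
git pull                  # update the launcher scripts themselves
./launcher rebuild app    # rebuild the container against the latest Discourse
```

The rebuild takes the site offline for a few minutes while the container is recreated, so plan for a short window of downtime.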
I am all up to date now, thanks friend. But how do I go about using one of my self-hosted models on my Raspberry Pi with Ollama, if that is even possible?
While possible, hosting both an entire LLM and Discourse on the Pi will most likely leave you with unsatisfactory performance for both.
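If you do want to try it anyway, Ollama exposes an OpenAI-compatible API that Discourse AI can point at. A minimal sanity check from the Pi itself might look like this (assuming Ollama on its default port 11434; `llama3.2` is just an example model name, substitute whatever you have pulled):

```shell
# Verify Ollama is serving a model before pointing Discourse at it.
# Assumes Ollama is listening on its default port 11434.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3.2",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```

One gotcha: since Discourse runs inside a Docker container, `localhost` there is not the Pi itself, so you would likely need to use the Pi's LAN IP (or a Docker host address) as the endpoint in the LLM settings.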
Falco actually does this
I tried running Ollama on a Pi. It took so long that the client gave up trying to connect. Unless you have a good deal of resources, running Ollama on a 4GB RAM Pi isn’t the best idea. I’m not sure if swap would help much.
On a similar but off-topic note, in my experience, running a Discourse dev env on a 4GB RAM Pi with no swap on an SD card doesn’t work. But that’s neither here nor there.
