I have a question regarding the AI plugin. Currently, I only see OpenAI as an option. Is it possible for us to use a local model, such as Ollama, or another program that lets us host our own models? This would help us save on costs since the tokens would be free.
Yes, the plugin is completely provider agnostic; you can use it with a local LLM out of the box.
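For anyone finding this later, here is a rough sketch of what the manual configuration might look like, assuming you are pointing the plugin at Ollama's OpenAI-compatible endpoint (Ollama serves one at `http://localhost:11434/v1` by default). The model name below is only an example, and the exact admin field names may vary between Discourse AI versions:

```
# Illustrative values only -- adjust for your setup
Provider:   OpenAI-compatible
URL:        http://localhost:11434/v1/chat/completions
API key:    ollama        # Ollama accepts any placeholder key
Model name: llama3
```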
Note: if you are looking for free options, OpenRouter always has a few models that are free and decent. For example, Grok 4.1 Fast, a very competent model, is free today (and this changes every few days).
This is fully supported in Discourse AI.
Local models may work, but they are generally pretty weak, so your results will be quite uneven.
Not to mention Gemini… IMO the free plan is very generous.
If I can use a local LLM, how do I set it up?
How do I use Gemini on my forum, please?