Yes, the plugin is completely provider-agnostic; you can use it with a local LLM out of the box.
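For example, here's a quick sanity check of a local endpoint before wiring it into the plugin. This is a minimal sketch, assuming Ollama serving its OpenAI-compatible API on the default port; the model name and prompt are placeholders:

```python
import json
import urllib.request

# Assumed local setup: Ollama exposing an OpenAI-compatible endpoint
# on its default port (11434). "llama3" is a placeholder model name.
ENDPOINT = "http://localhost:11434/v1/chat/completions"

payload = {
    "model": "llama3",  # any model you have pulled locally
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
}

req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# If this prints a completion, the same URL should work as the endpoint
# of a manually configured LLM in the Discourse AI admin settings.
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
```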