Yes, the plugin is completely provider-agnostic; you can use it with a local LLM out of the box.
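For instance, a local runner such as Ollama exposes an OpenAI-compatible chat endpoint that provider-agnostic clients can point at. The sketch below is not from the post; it simply pings such an endpoint directly to confirm the local model responds before wiring it into the plugin. The URL and port are Ollama's defaults, and the model name is an example, not a plugin setting.

```python
# Minimal sketch: verify a local Ollama instance answers on its
# OpenAI-compatible endpoint (default host/port assumed).
import requests

resp = requests.post(
    "http://localhost:11434/v1/chat/completions",  # Ollama's OpenAI-compatible URL
    json={
        "model": "llama3",  # example model name; use whatever you have pulled locally
        "messages": [{"role": "user", "content": "Hello from Discourse AI"}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If this returns a completion, the same base URL can be supplied to any client that lets you override the API endpoint.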
Related topics

| Topic | Replies | Views | Activity |
|---|---|---|---|
| Getting discourse ai to work with ollama locally | 15 | 539 | April 6, 2025 |
| Discourse AI with local ollama Internal Server Error | 2 | 85 | December 28, 2025 |
| How to configure Discourse to use a locally installed LLM? | 8 | 282 | September 17, 2025 |
| Discourse AI plugin: missing model discovery & sensible defaults (any plans or community plugins?) | 4 | 82 | February 3, 2026 |
| Create custom LLM plugin or any other option? | 4 | 176 | February 25, 2025 |