While possible, hosting both an entire LLM and Discourse on the Pi will most likely leave you with unsatisfactory performance for both.
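A quick back-of-envelope check makes the point: the figures below (a Pi with 8 GiB of RAM, roughly 2 GiB for Discourse itself, a 7B model at 4-bit quantization) are illustrative assumptions, not measured requirements.

```python
# Rough estimate of RAM needed just to hold a model's weights at a given
# quantization level, compared against a Pi's total memory. All numbers
# here are assumptions for illustration, not official requirements.

def model_ram_gib(params_billions: float, bytes_per_param: float) -> float:
    """RAM (GiB) to hold the weights alone; excludes KV cache and runtime."""
    return params_billions * 1e9 * bytes_per_param / 2**30

pi_ram_gib = 8.0                       # assumed: top Raspberry Pi configuration
discourse_gib = 2.0                    # assumed working set for Discourse + DB
weights_gib = model_ram_gib(7, 0.5)    # 7B params at 4-bit (0.5 bytes/param)

headroom = pi_ram_gib - discourse_gib - weights_gib
print(f"weights: {weights_gib:.1f} GiB, headroom: {headroom:.1f} GiB")
```

The weights alone fit, but the remaining headroom has to cover the OS, the model's KV cache, and filesystem caching, which is why inference and the forum end up starving each other.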
Related topics

| Topic | Replies | Views | Activity |
|---|---|---|---|
| Discourse on a Raspberry Pi | 62 | 6390 | October 8, 2024 |
| Getting discourse ai to work with ollama locally | 15 | 480 | April 6, 2025 |
| Self-Hosting an OpenSource LLM for DiscourseAI | 7 | 3534 | January 20, 2026 |
| Discourse AI with local ollama Internal Server Error | 2 | 64 | December 28, 2025 |
| Hosting Discourse on a Raspberry Pi? | 38 | 7374 | December 7, 2021 |