While possible, hosting both an entire LLM and Discourse on the Pi will most likely leave you with unsatisfactory performance for both.
— Falco