Can I add other LLMs?
Support
ai
sam (Sam Saffron)
August 1, 2024, 00:01
It works with vLLM; you can just configure a vLLM endpoint and it should work out of the box.
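A quick way to confirm the endpoint is reachable before entering it in the Discourse AI admin settings is to call vLLM's OpenAI-compatible API directly. The sketch below is only an illustration: the model name, port, and base URL are placeholders for whatever you actually serve, and it assumes you started the server with something like `vllm serve <model> --port 8000`.

```python
# Minimal sketch: sanity-check a vLLM OpenAI-compatible endpoint before
# pointing Discourse AI at it. Model, port, and URL below are assumptions.
import requests

BASE_URL = "http://localhost:8000/v1"  # same base URL you would give Discourse AI

# List the models the server exposes; Discourse needs the exact model id.
models = requests.get(f"{BASE_URL}/models", timeout=30).json()
model_id = models["data"][0]["id"]
print("Serving model:", model_id)

# Send a small chat completion, the same style of request Discourse AI
# would make against an OpenAI-compatible endpoint.
resp = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": model_id,
        "messages": [{"role": "user", "content": "Reply with the word 'ok'."}],
        "max_tokens": 10,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If both calls succeed, the same base URL and model id should be usable in the Discourse AI LLM configuration.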
6 likes
Related topics

- DeepSeek provider support? What to do when model provider isn't in "Provider" list? (Support, ai): 13 replies, 603 views, last activity March 5, 2025
- Debugging adding new LLM (Support, ai): 8 replies, 216 views, last activity August 23, 2024
- How to configure Discourse to use a locally installed LLM? (Support, ai): 7 replies, 133 views, last activity June 3, 2025
- Create custom LLM plugin or any other option? (Support): 4 replies, 84 views, last activity February 25, 2025
- Self-Hosting an OpenSource LLM for DiscourseAI (Self-Hosting, ai): 5 replies, 2981 views, last activity February 21, 2025