Can I add other LLM?
Support · ai
sam (Sam Saffron) · August 1, 2024, 12:01am · #3
It works with vLLM: you can just configure a vLLM endpoint and it should work out of the box.
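For reference, a quick way to sanity-check the endpoint before pointing Discourse AI at it: vLLM's server exposes an OpenAI-compatible API, so a minimal request like the sketch below should return a completion. The host, port, and model name here are placeholders, not anything from this topic; swap in whatever you are actually serving.

```python
# Minimal sketch: verify a vLLM OpenAI-compatible endpoint responds
# before configuring it in Discourse AI.
# The URL and model name below are placeholders for your own deployment.
import requests

VLLM_URL = "http://localhost:8000/v1/chat/completions"  # adjust host/port
MODEL = "meta-llama/Llama-3.1-8B-Instruct"  # whatever model vLLM is serving

resp = requests.post(
    VLLM_URL,
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Say hello in one word."}],
        "max_tokens": 16,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If this returns a sensible reply, the same base URL and model name should be what you enter when setting up the LLM in the Discourse AI settings.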