Can I add other LLMs?
Support
ai
sam (Sam Saffron)
August 1, 2024, 12:01am
It works with vLLM: you can just configure a vLLM endpoint and it should work out of the box.
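vLLM exposes an OpenAI-compatible API, so the Discourse AI LLM settings only need to point at that endpoint. As a minimal sketch for sanity-checking the endpoint before wiring it into Discourse, the snippet below sends one chat completion request; the host, port, and model name are assumptions, not values from this topic.

```python
# Minimal sketch: confirm a vLLM OpenAI-compatible endpoint responds before
# pointing the Discourse AI LLM settings at it. Host, port, and model name
# are assumptions -- substitute whatever your vLLM server actually serves.
import json
import urllib.request

VLLM_URL = "http://localhost:8000/v1/chat/completions"  # assumed default `vllm serve` port

payload = {
    "model": "mistralai/Mistral-7B-Instruct-v0.2",  # assumed example model id
    "messages": [{"role": "user", "content": "Say hello in one word."}],
    "max_tokens": 16,
}

req = urllib.request.Request(
    VLLM_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    # Print the model's reply; if this works, the same URL can be used as the
    # endpoint in the Discourse AI LLM configuration.
    print(body["choices"][0]["message"]["content"])
```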