Discourse Meta
Getting Discourse AI to work with Ollama locally
Support
ai
Falco
February 19, 2025, 7:25pm
Set the URL to Ollama's OpenAI-compatible chat endpoint:

http://localhost:11434/v1/chat/completions
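For reference, Ollama listens on port 11434 by default and exposes an OpenAI-compatible API under `/v1`, so the endpoint can be smoke-tested from a shell before pasting it into the Discourse AI LLM settings. A minimal sketch (the `llama3` model name is an assumption; substitute a model you have actually pulled):

```shell
# Ollama's default OpenAI-compatible chat endpoint (stock port 11434)
URL="http://localhost:11434/v1/chat/completions"
echo "$URL"

# Smoke test against a running Ollama instance; "llama3" is an assumed
# model name, replace it with one you have pulled via `ollama pull`:
# curl "$URL" -H 'Content-Type: application/json' \
#   -d '{"model":"llama3","messages":[{"role":"user","content":"Hi"}]}'
```

If the curl call returns a JSON chat completion, the same URL should work from the plugin.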