Behind admin makes total sense. Move on to bigger, better things and I’ll submit a PR; then we can discuss more if need be.
PR created, feel free to review and discuss: FEAT: Send user email if admin access is available by rjriel · Pull Request #38 · discourse/discourse-mcp · GitHub
I was having an issue there as well. It’s an interesting caveat to be aware of, and maybe important for the docs: when creating an admin key, don’t select “all users”; instead select “single user” and choose “system”.
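For anyone hitting the same thing, here is a quick sanity check from the command line. The hostname is a placeholder, and `Api-Username: system` corresponds to the “single user” / “system” choice above:

```shell
# Sanity-check an admin API key scoped to the "system" user.
# Replace forum.example.com with your Discourse hostname and $API_KEY with your key.
curl -s "https://forum.example.com/admin/users/list/active.json" \
  -H "Api-Key: $API_KEY" \
  -H "Api-Username: system"
# A JSON array of users means the key works with admin scope;
# an error page here usually means the key was scoped to the wrong user.
```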
How is MCP related to the Discourse-AI plugin? Does the plugin need to be installed and activated in the forum? Since you say I don’t need to ask the admin to do anything, I would expect that it isn’t required. But there is the tag for the Discourse AI Plugin on this topic. (And since you also added it to the new topic today, it doesn’t seem like this was added by accident.)
Perhaps the ai tag is supposed to refer to ‘ai’ in general and not the plugin specifically.
I found it was a bit tricky to set up Discourse MCP on OpenAI Codex CLI, so I wrote a guide for anyone else wanting to do the same thing.
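For reference, the rough shape of the config I ended up with. This is a sketch assuming current Codex CLI conventions (`~/.codex/config.toml` with an `mcp_servers` table); the `@discourse/mcp` package name is an assumption, so check the discourse-mcp README for the exact invocation and options:

```toml
# ~/.codex/config.toml — register Discourse MCP as an MCP server for Codex CLI.
[mcp_servers.discourse]
command = "npx"
# Pass site/auth options per the discourse-mcp README.
args = ["-y", "@discourse/mcp@latest"]
```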
Is it possible to access PDF attachments to discourse posts via MCP?
Thanks for the Discourse MCP, it is great to be able to interact with my various Discourses via LLM!
Having played with it, I have a couple of thoughts around the functionality.
Remote (web) MCP
First up, I think that while a local npx-run, stdio-based MCP is useful for power users who are comfortable in the CLI, have npm installed, and are already using Claude Code, opencode, Codex CLI, or similar, it won’t really be possible for the majority of Discourse users to use the MCP until it is something that each Discourse instance publishes at a well-known URL.
I wonder if perhaps a plugin-ized version of the MCP could run on the same server as a self-hosted Discourse (perhaps in a separate container like mail-receiver), interacting with the Discourse via API (as does mail-receiver) but also interacting with web-based LLMs like Claude Web / ChatGPT Web via an authenticated web API. This would unlock the MCP feature for non-dev users.
I wanted to check that something similar isn’t in the pipeline already.
Edit an existing post
Various LLMs reported to me while using the MCP that they could create new Topics and Replies, but couldn’t edit existing posts. For Discourse sysadmins, being able to ask an LLM to update a Wiki (for example) would be a super powerful capability.
for this!!!
Ability to edit existing posts/topics is a very useful addition.
In our use case, we use LLMs to maintain KB/Docs categories, so we use local helper scripts to edit existing posts/topics.
A Markdown repo with GitHub Actions is not an option, unfortunately. Most of our community moderators and contributors are non-technical people and are already familiar with the Discourse composer.
I added an edit tool to MCP, just update to latest.
Our MCP has support for HTTP transport too, not only stdio. I added that back in October of last year, even before publishing this blog post. So you can run it on a sidecar service anywhere you want!
Is there a guide for a ‘Meta recommended’ way to do this?
Thanks @Falco, that’s awesome!
Great work on the MCP.
Any plans to release an HTTP/SSE streaming version of the Discourse MCP server so we can add it as a connector to Claude.ai Chat?
We have supported HTTP since before this announcement; see two replies above:
I have used it with Claude desktop here:
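For anyone wanting to reproduce that, the Claude Desktop MCP config looks roughly like this (a sketch: `mcpServers` is the standard key in `claude_desktop_config.json`, but the `@discourse/mcp` package name is an assumption, so check the discourse-mcp README for the exact invocation):

```json
{
  "mcpServers": {
    "discourse": {
      "command": "npx",
      "args": ["-y", "@discourse/mcp@latest"]
    }
  }
}
```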
We just added Data Explorer integration to Discourse MCP, allowing technical and non-technical users alike to explore the vast data on their Discourse instances. The MCP uses the existing Data Explorer workflow, running read-only queries against the live production database, and is able to create, run, update, and delete Data Explorer reports. To get started, use the same flow as described in our Discourse MCP is here! blog post, provide an admin API key, and connect it to your favorite LLM.
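As an illustration of the kind of report this can generate, a minimal read-only Data Explorer query against the standard core Discourse schema (table and column names as in core; your instance may differ):

```sql
-- Ten most recently created topics, excluding private messages.
SELECT id, title, created_at
FROM topics
WHERE archetype = 'regular'
ORDER BY created_at DESC
LIMIT 10
```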
That said, for use with a website, you will need to run the MCP CLI at a web-accessible address. When I tested that, I used Cloudflare Tunnels for it.
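Assuming the MCP’s HTTP transport is listening locally (port 8080 here is just an example), the Cloudflare Tunnels step is a one-liner:

```shell
# Expose a locally running Discourse MCP HTTP endpoint via a quick Cloudflare Tunnel.
# cloudflared prints a public trycloudflare.com URL to use as the connector address.
cloudflared tunnel --url http://localhost:8080
```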