Discourse AI plugin: missing model discovery & sensible defaults (any plans or community plugins?)

Hi everyone :waving_hand:,

First of all, thanks to the Discourse team for building and maintaining the official AI plugin. It’s clear that a lot of thought has gone into making it stable and flexible for different deployment scenarios.

That said, after integrating several AI providers (including OpenAI-compatible gateways and third-party Gemini endpoints), I’ve run into a couple of UX gaps that seem increasingly painful as AI tooling matures. I wanted to ask whether there are plans to address them — or if there’s interest in a community plugin that does.


1. No model discovery / model list from provider

At the moment, when adding a model, the admin must:

  • Enter the model ID by hand
  • Know in advance which models the provider supports
  • Ensure the ID is typed exactly right

In most modern AI tools and gateways (OpenAI Playground, OpenRouter, OneAPI, LM Studio, etc.), it’s now standard to:

  • Fetch a list of available models from the provider (e.g. /v1/models)
  • Let the user select from a dropdown
  • Optionally show basic capabilities (context length, vision support, etc.)

I understand that Discourse AI supports many non-standard or proxied backends, and that not all providers implement model listing consistently. Still, even an optional “Fetch models from provider” action (best-effort, OpenAI-compatible) would dramatically improve usability for many setups.
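To make the "best-effort" idea concrete, here is a rough sketch of what such a fetch could look like against an OpenAI-compatible gateway. This is purely illustrative: the function names, error handling, and the assumption that the provider exposes `/v1/models` are mine, not anything in the Discourse AI plugin.

```ruby
require "json"
require "net/http"
require "uri"

# Best-effort model discovery against an OpenAI-compatible endpoint.
# Hypothetical sketch: helper names and the graceful-failure policy
# are illustrative, not part of the official plugin.
def fetch_models(base_url, api_key)
  uri = URI.join(base_url, "/v1/models")
  req = Net::HTTP::Get.new(uri)
  req["Authorization"] = "Bearer #{api_key}"
  res = Net::HTTP.start(uri.hostname, uri.port,
                        use_ssl: uri.scheme == "https") do |http|
    http.request(req)
  end
  return [] unless res.is_a?(Net::HTTPSuccess)
  parse_model_ids(res.body)
rescue StandardError
  [] # provider may not implement /v1/models; fail quietly, keep manual entry
end

# A listing response typically looks like
# {"object":"list","data":[{"id":"..."},...]} — extract the IDs for a dropdown.
def parse_model_ids(body)
  JSON.parse(body).fetch("data", []).map { |m| m["id"] }.compact.sort
end
```

The key design point is that the fetch only ever *adds* options: if the request fails or the provider returns something unexpected, the admin falls back to typing the ID exactly as today.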


2. Context window should not require manual input (or should have a default)

Currently, the context window field has no default value and must be manually entered.

From a user perspective, this feels like something the plugin should either:

  • Default to the model’s known maximum context, or
  • Fall back to a safe, reasonable default if unknown, or
  • Treat an empty value as “use provider/model default”

Requiring admins to research and enter context sizes manually is error-prone and unnecessary, especially since the model name usually implies this information.
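The fallback chain above could be as simple as a lookup table keyed on the model ID, with a conservative default when nothing matches. Again a hypothetical sketch: the patterns, token counts, and fallback value are illustrative guesses, not official plugin data.

```ruby
# Illustrative defaulting logic for the context window field.
# The pattern table and fallback are examples only — real values
# would come from provider metadata or a maintained mapping.
KNOWN_CONTEXT_WINDOWS = {
  /gpt-4o/      => 128_000,
  /gemini-1\.5/ => 1_000_000,
  /claude-3/    => 200_000,
}.freeze

FALLBACK_CONTEXT_WINDOW = 8_192 # safe, conservative default

def default_context_window(model_id)
  KNOWN_CONTEXT_WINDOWS.each do |pattern, tokens|
    return tokens if model_id =~ pattern
  end
  FALLBACK_CONTEXT_WINDOW
end
```

An empty admin field would then mean "use `default_context_window`", and admins could still override it explicitly when they know better.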


3. Question: plans, patterns, or community plugins?

So my questions to the community and maintainers are:

  • Are there any plans to improve model discovery and defaults in the official AI plugin?
  • Are there recommended patterns for handling this more ergonomically today?
  • Does anyone know of (or is anyone working on) a community plugin or extension that addresses these gaps?

If the answer is “no, and it’s unlikely to land in core,” I’d seriously consider experimenting with a small companion plugin that focuses purely on:

  • Model discovery
  • Capability metadata
  • Sensible defaults

Before going down that path, I wanted to check whether this is something others are interested in, or if there’s context I might be missing.

Thanks for reading, and I’d love to hear your thoughts.


I suspect someone has already built a plugin like this. If so, I'd appreciate a recommendation — thanks!

Model context window size

Maximum context token size for the model. If it is 0, it is automatically populated from the model metadata (if available); otherwise it can be set manually.

Is your first point here the same as in this topic?
