RAILS_ENV=production bundle exec rake assets:precompile
The self-hosted version of Discourse does not work with the latest discourse-ai plugin due to SCSS compile errors: @use "lib/viewport" should be the first line of discourse-ai.css. Can you please check?
Are you running this against an earlier release of Discourse or is it on main?
After installing the Discourse AI plugin on the stable version of Discourse (3.4.6), I realized that Discourse says that #bdef136 (commit dated February 4, 2025) is the latest commit, and that the currently installed plugin is the latest.
However, when I checked again because there was a problem where the key fields required for AWS Bedrock integration were not displayed, I found that it was at least 300 commits behind, and when I actually went to the discourse-ai repository and checked the main branch, the gap was large. This symptom also appears in other Discourse instances that I manage, and I can't figure out what the problem is.
It hasn't even been a week since I installed the discourse-ai plugin on my existing instance, but the commit of the version that was actually installed is more than 4-5 months old. I'm curious why it's showing as the latest, and how to fix it so that the plugin can be replaced with the latest commit.
Plugins are pinned after a stable release. That means you won't automatically get newer plugin updates that might rely on changes in the Discourse core which aren't part of the stable release yet.
If a plugin update uses code that only exists in the latest tests-passed branch, it could break on stable because that code simply isnât there.
For example, the AI plugin started using the js-diff library, which was only added to core Discourse after the February stable release. So the latest version of the AI plugin won't work with that stable version.
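As a side note, you can measure how far behind a pinned plugin checkout is yourself with `git rev-list --count`. A small sketch, demonstrated in a throwaway repo so it runs anywhere; on a real install you would run just the final command inside the container, in /var/www/discourse/plugins/discourse-ai, comparing against origin/main after a `git fetch`:

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q
# "pinned" stands in for the commit your stable release pinned the plugin to
git -c user.email=demo@example.com -c user.name=demo commit -q --allow-empty -m "pinned"
git branch pinned
# two newer commits stand in for upstream development on main
git -c user.email=demo@example.com -c user.name=demo commit -q --allow-empty -m "newer 1"
git -c user.email=demo@example.com -c user.name=demo commit -q --allow-empty -m "newer 2"
# count how many commits the pinned checkout is behind: prints 2 here
git rev-list --count pinned..HEAD
```

On a live instance the equivalent is `git fetch origin main && git rev-list --count HEAD..origin/main`, which would show the ~300-commit gap described above.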
Oh, I see! So the git clone command in the app.yml file does more than what's literally written in that file! Now it makes sense.
Now, when I activate the AI plugin and add a new LLM, I can select AWS Bedrock as a provider, but why don't the AWS Bedrock access key ID and AWS Bedrock region input fields appear when I select it, unlike what is described in this document: Configuring Amazon Bedrock services
I'm using Discourse version 3.4.5, and the AI plugin is version 0.0.1 at commit bdef136.
We do not support Discourse AI on stable Discourse, please use our default release channel for the best experience.
This plugin is now bundled with Discourse core as part of Bundling more popular plugins with Discourse core. If you are self-hosting and use the plugin, you need to remove it from your app.yml before your next upgrade.
I am trying to set up this plugin with Azure. We connect to OpenAI via our own gateway, which uses Azure. I was trying to configure it using Manual LLM configuration
Processing by DiscourseAi::Admin::AiLlmsController#test as JSON
Parameters: {"ai_llm"=>{"max_prompt_tokens"=>"2000", "api_key"=>"[FILTERED]", "tokenizer"=>"DiscourseAi::Tokenizer::OpenAiTokenizer", "url"=>"<OUR_URL>", "display_name"=>"test-ai-gateway", "name"=>"gpt-4", "provider"=>"azure", "enabled_chat_bot"=>"true", "vision_enabled"=>"false"}}
This doesn't seem to work. Can anyone provide the right way to set this up?
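One way to narrow this down is to take Discourse out of the loop and hit the gateway directly with the same URL and key entered in the Manual LLM configuration. A sketch, assuming your gateway mirrors Azure OpenAI's request shape; the resource name, deployment name, and api-version below are placeholders, not values from the plugin:

```shell
# Azure-style endpoints authenticate with an "api-key" header rather than
# "Authorization: Bearer", and expect the deployment and api-version in the URL.
URL="https://YOUR-RESOURCE.openai.azure.com/openai/deployments/YOUR-DEPLOYMENT/chat/completions?api-version=2024-02-01"
# Print the curl invocation so you can review it, then run it by hand
# with your real gateway URL and key:
cat <<EOF
curl -s "$URL" \\
  -H "Content-Type: application/json" \\
  -H "api-key: \$AZURE_API_KEY" \\
  -d '{"messages":[{"role":"user","content":"ping"}]}'
EOF
```

If that direct call fails too, the problem is on the gateway side rather than in the plugin settings.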
My Question:
When using an AI model for full-text translation, I find the results mediocre. I want to switch to another AI model to retranslate the content. How can I retranslate already-translated posts using the new model?
Additionally, is there any efficient way to quickly count the number of posts that have already been translated?
At the moment, you will need to use either the Rails or the PostgreSQL console to delete the existing translations.
It's on our roadmap to add a status page with per-language progress for the translation backlog.
Sorry, I can't find the specific table to delete. Can you tell me the table name, and where is the roadmap for the translation progress?
Hello, will there be an update regarding GPT 5?
Hi, there is no official announcement from the Discourse team about GPT-5 updates, or that GPT-5 exists at all; if you really want it, just wait until it releases, then check.
Should land shortly:
Note on GPT-5: I have been using it all day. It is good, but it is a reasoning model, so it is slow.
Feel free to fiddle with reasoning effort; on low it is an acceptable speed, but still way slower than 4.1.
Pulled the latest update and it looks like OpenAI has made some API changes.
Trying to contact the model returned this error: { "error": { "message": "Unsupported parameter: 'reasoning_effort'. In the Responses API, this parameter has moved to 'reasoning.effort'. Try again with the new parameter. See the API documentation for more information: https://platform.openai.com/docs/api-reference/responses/create.", "type": "invalid_request_error", "param": null, "code": "unsupported_parameter" } }
I have the url set to https://api.openai.com/v1/responses here.
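For reference, the parameter move that the error message describes looks like this (illustrative payload fragments, not complete requests):

```shell
# Chat Completions (/v1/chat/completions) takes a top-level "reasoning_effort",
# while the Responses API (/v1/responses) nests it under "reasoning.effort".
COMPLETIONS_PAYLOAD='{"model": "gpt-5", "reasoning_effort": "low"}'
RESPONSES_PAYLOAD='{"model": "gpt-5", "reasoning": {"effort": "low"}}'
echo "Chat Completions: $COMPLETIONS_PAYLOAD"
echo "Responses:        $RESPONSES_PAYLOAD"
```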
It's working under the completions API. I will follow up with a fix for the responses API.
FYI, everyone is reporting that GPT-5 with the Responses API is incredibly slow. So what you have now is probably more useful.
I pushed a fix for this a couple of hours ago; give it a try.
Works like a charm boss