Discourse AI

2 posts were split to a new topic: Is it possible to enable only one AI persona?

Running RAILS_ENV=production bundle exec rake assets:precompile on a self-hosted Discourse does not work with the latest discourse-ai plugin because of SCSS compile errors: @use "lib/viewport" should be the first line of discourse-ai.css. Can you please check?

Are you running this against an earlier release of Discourse or is it on main?

After installing the Discourse AI plugin on the stable version of Discourse (3.4.6), I noticed that Discourse reports #bdef136 (a commit dated February 4, 2025) as the latest commit and says the currently installed plugin is up to date.

However, when I looked again (because the key fields required for AWS Bedrock integration were not being displayed), I found that the installed plugin was at least 300 commits behind, and checking the main branch of the discourse-ai repository confirmed the gap was large. The same symptom appears on other Discourse instances that I manage, and I can’t figure out what the problem is.

It hasn’t even been a week since I installed the discourse-ai plugin on my existing instance, yet the commit that was actually installed is 4-5 months old. Why is it showing as the latest, and how can I get the plugin updated to the latest commit?

Plugins are pinned after a stable release. That means you won’t automatically get newer plugin updates that might rely on changes in the Discourse core which aren’t part of the stable release yet.
If a plugin update uses code that only exists in the latest tests-passed branch, it could break on stable because that code simply isn’t there.

For example, the AI plugin started using the js-diff library, which was only added to core Discourse after the February stable release, so the latest version of the AI plugin won’t work with that stable version.
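If you’re curious how the pin itself works: a plugin repo can ship a .discourse-compatibility file that maps core version constraints to the git ref a stable install should check out, and the clone step consults it. The entries below are purely illustrative placeholders, not the plugin’s actual file:

```text
# .discourse-compatibility (illustrative entries only)
# "core versions at or below X should use ref Y"
3.4.999: 0123456789abcdef0123456789abcdef01234567
3.3.999: fedcba9876543210fedcba9876543210fedcba98
```

That is why a stable site can honestly report an older commit as the “latest” available to it.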

3 Likes

Oh, I see! So the git clone command in the app.yml file does more than what is written there! Now it makes sense. :open_mouth:

Now, when I activate the AI plugin and add a new LLM, I can select AWS Bedrock as a provider, but the AWS Bedrock access key id and AWS Bedrock region input fields don’t appear, unlike what is described in this document: Configuring Amazon Bedrock services

I’m using Discourse version 3.4.5, and the AI plugin is version 0.0.1 at commit bdef136.

We do not support Discourse AI on stable Discourse; please use our default release channel for the best experience.

4 Likes

:partying_face: This plugin is now bundled with Discourse core as part of Bundling more popular plugins with Discourse core. If you are self-hosting and use the plugin, you need to remove it from your app.yml before your next upgrade.

4 Likes

I am trying to set up this plugin with Azure. We connect to OpenAI via our own gateway, which uses Azure. I was trying to configure it using manual LLM configuration.

Processing by DiscourseAi::Admin::AiLlmsController#test as JSON
  Parameters: {"ai_llm"=>{"max_prompt_tokens"=>"2000", "api_key"=>"[FILTERED]", "tokenizer"=>"DiscourseAi::Tokenizer::OpenAiTokenizer", "url"=>"<OUR_URL>", "display_name"=>"test-ai-gateway", "name"=>"gpt-4", "provider"=>"azure", "enabled_chat_bot"=>"true", "vision_enabled"=>"false"}}

This doesn’t seem to work. Can anyone provide the right way of setting this up?
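For what it’s worth, one thing I can do to narrow it down is check outside the plugin whether the gateway answers a plain OpenAI/Azure-style request. A minimal Ruby sketch of that check, where the URL, deployment name, api-version and auth header are all placeholders for whatever our gateway actually expects:

```ruby
require "net/http"
require "json"
require "uri"

# Placeholder endpoint: adjust to the gateway's real path and api-version.
uri = URI("https://your-gateway.example.com/openai/deployments/gpt-4/chat/completions?api-version=2024-02-01")

req = Net::HTTP::Post.new(uri, "Content-Type" => "application/json")
# Azure-style auth header; some gateways want "Authorization: Bearer <key>" instead.
req["api-key"] = ENV.fetch("GATEWAY_API_KEY")
req.body = { messages: [{ role: "user", content: "ping" }], max_tokens: 5 }.to_json

res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
puts "#{res.code}: #{res.body}"
```

If that returns a normal completion, the remaining question is what the plugin expects in the url field (presumably the full deployment path rather than just the host?).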

My Question:
When using an AI model for full-text translation, I find the results mediocre. I want to switch to another AI model to retranslate the content. How can I retranslate already-translated posts using the new model?

Additionally, is there any efficient way to quickly count the number of posts that have already been translated?

At the moment, you will need to use either the Rails or the PostgreSQL console to delete the existing translations.

It’s on our roadmap to add a status page with per-language progress for the translation backlog.
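Something along these lines, as a rough sketch; double-check the table/model names against your own install before deleting anything, since they may differ:

```ruby
# In the Rails console (./launcher enter app, then `rails c`).
# Assumed models: PostLocalization / TopicLocalization -- verify against your schema.

# Posts that already have at least one translation:
PostLocalization.distinct.count(:post_id)

# Translation counts broken down by locale:
PostLocalization.group(:locale).count

# Remove the existing translations so the new model can regenerate them:
PostLocalization.delete_all
TopicLocalization.delete_all
```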

Sorry, I can’t find the specific table to delete. Can you tell me the table name, and where can I follow the roadmap for the translation progress page?

Hello, will there be an update regarding GPT 5?

1 Like

Hi, there has been no official announcement from the Discourse team about GPT-5 support; if you really want it, just wait until it is released, and then you can check.

Should land shortly:

A note on GPT-5: I have been using it all day and it is good, but it is a reasoning model, so it is slow.

Feel free to fiddle with the reasoning effort; on low the speed is acceptable, but still way slower than 4.1.

2 Likes

Pulled the latest update, and it looks like OpenAI has made some API changes:

Trying to contact the model returned this error: { "error": { "message": "Unsupported parameter: 'reasoning_effort'. In the Responses API, this parameter has moved to 'reasoning.effort'. Try again with the new parameter. See the API documentation for more information: https://platform.openai.com/docs/api-reference/responses/create.", "type": "invalid_request_error", "param": null, "code": "unsupported_parameter" } }

I have the url set to https://api.openai.com/v1/responses here.
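Concretely, the error is about a renamed parameter: the flat reasoning_effort field from Chat Completions is nested under a reasoning object in the Responses API. A sketch of the two payload shapes, with example values only:

```ruby
# Chat Completions (/v1/chat/completions) -- flat parameter:
chat_payload = {
  model: "gpt-5",
  messages: [{ role: "user", content: "hello" }],
  reasoning_effort: "low"
}

# Responses (/v1/responses) -- nested under `reasoning`:
responses_payload = {
  model: "gpt-5",
  input: "hello",
  reasoning: { effort: "low" }
}
```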

1 Like

It’s working under the Completions API. I will follow up with a fix for the Responses API.

2 Likes

FYI, everyone is reporting that GPT-5 with the Responses API is incredibly slow. So what you have now is probably more useful.

3 Likes

I pushed a fix for this a couple of hours ago; give it a try.

1 Like

Works like a charm boss :heart_eyes::ok_hand:t2:

1 Like