OpenAI Azure endpoint support for gpt-4-32k?


I’m encountering an issue with the OpenAI Azure endpoint. It seems not to be working as expected for the following URL:


Where DEPLOYMENT_NAME is set to gpt-4-32k.

Could you please assist in resolving this issue? Any guidance or suggestions would be greatly appreciated.

Additionally, I have some queries regarding Discourse plugins:

  1. How to Fork a Discourse Plugin: Could you provide detailed instructions or a guide on how to fork an existing Discourse plugin?
  2. Testing the Forked Plugin: Once I have forked a plugin, what are the steps to test this forked version on my own Discourse instance?
  3. Uploading and Activating the Plugin: After testing, how can I upload and activate this forked plugin on my Discourse instance?
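On the plugin questions: the usual workflow is to fork the plugin's repository on GitHub (for Discourse AI, that is discourse/discourse-ai), push your changes to your fork, and point your Discourse container at your fork's Git URL. On a standard Docker-based install that means editing containers/app.yml and rebuilding. A minimal sketch, assuming your fork lives under a hypothetical YOUR_GITHUB_USER account:

```yaml
# containers/app.yml — YOUR_GITHUB_USER is a placeholder for your own account
hooks:
  after_code:
    - exec:
        cd: $home/plugins
        cmd:
          - git clone https://github.com/YOUR_GITHUB_USER/discourse-ai.git
```

After editing app.yml, rebuild with ./launcher rebuild app; the forked plugin is then enabled from the admin settings like any other plugin. There is no separate "upload" step — the clone during rebuild is the installation.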

Your assistance with these queries would be extremely helpful. Thank you for your time and support.

Best regards,

Did you set that in the ai openai gpt4 32k url setting?

Yes I did, but the problem seems to be that gpt-4-32k-2023-07-01-preview is not handled at all in the file lib/summarization/entry_point.rb:

module DiscourseAi
  module Summarization
    class EntryPoint
      def inject_into(plugin)
        foldable_models = [
          Models::OpenAi.new("gpt-4", max_tokens: 8192),
          Models::OpenAi.new("gpt-4-32k", max_tokens: 32_768),
          Models::OpenAi.new("gpt-4-1106-preview", max_tokens: 100_000),
          Models::OpenAi.new("gpt-3.5-turbo", max_tokens: 4096),
          Models::OpenAi.new("gpt-3.5-turbo-16k", max_tokens: 16_384),
          Models::Anthropic.new("claude-2", max_tokens: 200_000),
          Models::Anthropic.new("claude-instant-1", max_tokens: 100_000),
          Models::Llama2.new("Llama2-chat-hf", max_tokens: SiteSetting.ai_hugging_face_token_limit),
          Models::Llama2FineTunedOrcaStyle.new(
            "StableBeluga2",
            max_tokens: SiteSetting.ai_hugging_face_token_limit,
          ),
          Models::Gemini.new("gemini-pro", max_tokens: 32_768),
          Models::Mixtral.new("mistralai/Mixtral-8x7B-Instruct-v0.1", max_tokens: 32_000),
        ]

        foldable_models.each do |model|
          plugin.register_summarization_strategy(Strategies::FoldContent.new(model))
        end

        truncable_models = [
          Models::Discourse.new("long-t5-tglobal-base-16384-book-summary", max_tokens: 16_384),
          Models::Discourse.new("bart-large-cnn-samsum", max_tokens: 1024),
          Models::Discourse.new("flan-t5-base-samsum", max_tokens: 512),
        ]

        truncable_models.each do |model|
          plugin.register_summarization_strategy(Strategies::TruncateContent.new(model))
        end
      end
    end
  end
end

We can see this error in the logs:

DiscourseAi::Completions::Endpoints::OpenAi: status: 400 - body: {
  "error": {
    "message": "Unrecognized request argument supplied: tools",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}

Preparing payload with prompt: [{:role=>"system", :content=>"You are a helpful Discourse assistant.\nYou _understand_ and **generate** Discourse Markdown.\nYou live in a Discourse Forum Message.\n\nYou live in the forum with the URL: https://<URL>\nThe title of your site: Discourse\nThe description is: \nThe participants in this conversation are: gpt4_bot, Chris\nThe date now is: 2024-01-19 10:10:05 UTC, much has changed since you were trained.\n\nYou were trained on OLD data, lean on search to get up to date information about this forum\nWhen searching try to SIMPLIFY search terms\nDiscourse search joins all terms with AND. Reduce and simplify terms to find more results."}, {:role=>"user", :content=>"comment faire une boucle en dart ?", :name=>"Chris"}], model_params: {}, dialect: #<DiscourseAi::Completions::Dialects::ChatGpt:0x00007f230513e6e0>

Oh I see, that means you need an updated endpoint with tool support. Azure can handle it, as we are using it internally.


No, I don't think this is specific to Microsoft Azure: I've tested the same requests in plain JavaScript and they work without any problems, and without any notion of "tools". My impression is that this is a problem in the plugin (I should mention I'm not a Ruby developer), although the abstraction layer it uses makes things harder to follow. To be sure, we forked the plugin and added debug statements everywhere; the URL and the headers sent to Azure are 100% compliant.

DiscourseAi::Completions::Endpoints::OpenAi: status: 400 - body: { "error": { "message": "Unrecognized request argument supplied: tools", "type": "invalid_request_error", "param": null, "code": null } }

It seems that the only place where "tools" is set, in ./lib/completions/endpoints/open_ai.rb (DiscourseAi::Completions::Endpoints::OpenAi), is:

    def prepare_payload(prompt, model_params, dialect)
      # Debug logging added in our fork:
      Rails.logger.warn("Preparing payload with prompt: #{prompt}, model_params: #{model_params}, dialect: #{dialect}")
      default_options
        .merge(model_params)
        .merge(messages: prompt)
        .tap do |payload|
          payload[:stream] = true if @streaming_mode
          payload[:tools] = dialect.tools if dialect.tools.present?
        end
    end

which seems to lead us to ./lib/completions/dialects/chat_gpt.rb, where the tools list that ends up in payload[:tools] is built.
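To illustrate the behaviour being discussed, here is a minimal, self-contained sketch of a payload builder that adds :tools only when the dialect actually has tools. FakeDialect and build_payload are hypothetical stand-ins, not the plugin's real classes; the point is that an endpoint rejecting unknown arguments (like older Azure api-versions) only sees :tools when tools are present:

```ruby
# FakeDialect stands in for DiscourseAi::Completions::Dialects::ChatGpt.
FakeDialect = Struct.new(:tools)

def build_payload(prompt, dialect, streaming: false)
  { model: "gpt-4-32k", messages: prompt }.tap do |payload|
    payload[:stream] = true if streaming
    # The key must be omitted entirely (not sent as nil or []) when there
    # are no tools, otherwise strict endpoints return a 400 error.
    payload[:tools] = dialect.tools if dialect.tools && !dialect.tools.empty?
  end
end

with_tools = build_payload([{ role: "user", content: "hi" }],
                           FakeDialect.new([{ type: "function" }]))
without_tools = build_payload([{ role: "user", content: "hi" }],
                              FakeDialect.new(nil))

puts with_tools.key?(:tools)    # => true
puts without_tools.key?(:tools) # => false
```

If the guard evaluates truthily (a chatbot persona registered functions, for instance), the request will contain tools even though the user never asked for them, which matches the 400 error seen in the logs above.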

In the log message from PEYRUSSE Christian above, we can see that the dialect is #<DiscourseAi::Completions::Dialects::ChatGpt:0x00007f230513e6e0>.

If that can help…

Thank you.

It is not, as long as you are using an up-to-date endpoint.

We are using Azure endpoints on this site, but with the parameter being api-version=2023-12-01-preview. Can you try using an endpoint with that API version?
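For reference, Azure OpenAI chat-completion endpoints generally take the following shape, with the API version passed as a query parameter (RESOURCE_NAME and DEPLOYMENT_NAME are placeholders for your own values):

```
https://RESOURCE_NAME.openai.azure.com/openai/deployments/DEPLOYMENT_NAME/chat/completions?api-version=2023-12-01-preview
```

Support for the tools argument depends on the api-version: older preview versions such as 2023-07-01-preview do not accept it, which is consistent with the "Unrecognized request argument supplied: tools" error above.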

We'll check this out and let you know, most likely next week. Regards.


This morning I updated the Discourse AI plugin and it works now. That's great, thanks for your help.


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.