Hello, I have recently noticed that the Topic Summary function on my forum has seemingly stopped working entirely: requests to generate a topic summary hang indefinitely. Notably, the other AI tools on the site, such as Summarization Gists and the AI Composer Helper, are working properly.
Below is an image of the Topic Summary modal seemingly hanging:
Here is a log entry that appears to be related:
DiscourseAi::Completions::Endpoints::Gemini: status: 400 - body: {
  "error": {
    "code": 400,
    "message": "Invalid value at 'generation_config.response_schema.type' (type.googleapis.com/google.ai.generativelanguage.v1beta.Type), \"json_schema\"\nInvalid JSON payload received. Unknown name \"json_schema\" at 'generation_config.response_schema': Cannot find field.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.BadRequest",
        "fieldViolations": [
          {
            "field": "generation_config.response_schema.type",
            "description": "Invalid value at 'generation_config.response_schema.type' (type.googleapis.com/google.ai.generativelanguage.v1beta.Type), \"json_schema\""
          },
          {
            "field": "generation_config.response_schema",
            "description": "Invalid JSON payload received. Unknown name \"json_schema\" at 'generation_config.response_schema': Cannot find field."
          }
        ]
      }
    ]
  }
}
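For context, the error above suggests the request was sending an OpenAI-style "json_schema" wrapper where Gemini's native v1beta API expects an OpenAPI-style schema object directly under `generation_config.response_schema`. A minimal sketch of the two shapes (the schema contents here are hypothetical, purely for illustration):

```python
# Hypothetical summary schema used only for illustration.
summary_schema = {"type": "object", "properties": {"summary": {"type": "string"}}}

# Shape that would trigger the 400 above: an OpenAI-style "json_schema"
# wrapper, which Gemini's generation_config.response_schema does not
# recognize ("Unknown name \"json_schema\"").
rejected_payload = {
    "generation_config": {
        "response_mime_type": "application/json",
        "response_schema": {"type": "json_schema", "json_schema": summary_schema},
    }
}

# Shape Gemini's native v1beta API expects: an OpenAPI-style schema
# placed directly under response_schema.
accepted_payload = {
    "generation_config": {
        "response_mime_type": "application/json",
        "response_schema": {
            "type": "OBJECT",
            "properties": {"summary": {"type": "STRING"}},
        },
    }
}
```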
Has anyone else encountered this issue or have an idea on how to resolve this? Thanks in advance.
Falco
(Falco)
May 10, 2025, 2:18 PM
3
What model are you using?
1 Like
I am currently using Gemini 2.0 Flash, free tier.
1 Like
Falco
(Falco)
May 11, 2025, 11:25 PM
5
Can you try setting the provider to “OpenAI” and the endpoint to https://generativelanguage.googleapis.com/v1beta/chat/completions?
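For anyone trying this workaround: with the provider set to "OpenAI", the plugin speaks the OpenAI Chat Completions format, which Google's OpenAI-compatible endpoint accepts, including the `json_schema` response format. A sketch of what such a request body looks like (the model name and schema here are examples, not necessarily what the plugin sends):

```python
# Example OpenAI-format request body for Google's OpenAI-compatible
# endpoint (https://generativelanguage.googleapis.com/v1beta/chat/completions).
# Field names follow the OpenAI Chat Completions API; the schema contents
# are hypothetical.
request_body = {
    "model": "gemini-2.0-flash",
    "messages": [{"role": "user", "content": "Summarize this topic..."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "topic_summary",
            "schema": {
                "type": "object",
                "properties": {"summary": {"type": "string"}},
            },
        },
    },
}
```

Because this compatibility layer expects the OpenAI-style wrapper, the very field Gemini's native API rejected becomes valid here.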
1 Like
I have followed the steps you gave, and I can confirm the LLM is now working as expected. Thanks!
1 Like
sam
(Sam Saffron)
May 12, 2025, 2:12 AM
7
Very happy it is sorted, but I still think there is a bug here we should sort out @Falco / @Roman.
Things should maybe fall back if, for any reason, a model says it will return JSON but does not. This is one area we should eval.
1 Like
Falco
(Falco)
May 12, 2025, 2:47 AM
8
Oh definitely, I wanted to help isolate it to the Google API instead of the model so we can work on a fix this week.
4 Likes
Roman
(Roman Rizzi)
May 16, 2025, 12:24 PM
10
The Gemini error was fixed in:
main ← structured_output_differences (opened 02:09 PM, 15 May 25 UTC)
This change fixes two bugs and adds a safeguard.
The first issue is that the … schema Gemini expected differed from the one sent, resulting in 400 errors when performing completions.
The second issue was that creating a new persona wouldn't define a method for `response_format`. This has to be explicitly defined when we wrap it inside the Persona class. Also, there was a mismatch between the default value and what we stored in the DB: some parts of the code expected symbols as keys and others expected strings.
Finally, we add a safeguard for when the model, even if asked to, refuses to reply with valid JSON. In this case, we make a best-effort attempt to recover and stream the raw response.
The Gemini API was expecting a slightly different format for the JSON schema. Also, we’ll now treat the completion as plain text if the model doesn’t return valid JSON when asked to.
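The plain-text fallback described above can be sketched roughly like this (this is an illustrative Python sketch, not the plugin's actual Ruby implementation):

```python
import json

def handle_structured_completion(raw: str) -> dict:
    """Best-effort recovery sketch: if the model was asked for JSON but
    returned something else, fall back to treating the reply as plain text
    instead of failing the whole completion."""
    try:
        return {"kind": "structured", "data": json.loads(raw)}
    except json.JSONDecodeError:
        return {"kind": "plain_text", "data": raw}

# A valid JSON reply is parsed into structured data...
handle_structured_completion('{"summary": "All good"}')
# ...while a refusal or prose reply is passed through as raw text.
handle_structured_completion("I cannot produce JSON for this.")
```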
5 Likes
Roman
(Roman Rizzi)
Closed on
May 19, 2025, 11:00 AM
11
This topic was automatically closed after 2 days. New replies are no longer allowed.