Hello, I have recently noticed that the Topic Summary function on my forum has seemingly stopped working entirely, with requests to generate a topic summary hanging indefinitely. Notably, the other AI tools on the site, such as Summarization Gists and AI Composer Helper, are working properly.
Below is an image of the Topic summary modal seemingly hanging:
Here is a log entry that appears to be related:
DiscourseAi::Completions::Endpoints::Gemini: status: 400 - body: {
  "error": {
    "code": 400,
    "message": "Invalid value at 'generation_config.response_schema.type' (type.googleapis.com/google.ai.generativelanguage.v1beta.Type), \"json_schema\"\nInvalid JSON payload received. Unknown name \"json_schema\" at 'generation_config.response_schema': Cannot find field.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.BadRequest",
        "fieldViolations": [
          {
            "field": "generation_config.response_schema.type",
            "description": "Invalid value at 'generation_config.response_schema.type' (type.googleapis.com/google.ai.generativelanguage.v1beta.Type), \"json_schema\""
          },
          {
            "field": "generation_config.response_schema",
            "description": "Invalid JSON payload received. Unknown name \"json_schema\" at 'generation_config.response_schema': Cannot find field."
          }
        ]
      }
    ]
  }
}
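To make the error message concrete, here is a hypothetical sketch of the mismatch it describes: Gemini's native `generationConfig.response_schema` expects a bare schema with uppercase type enums, while the `"type": "json_schema"` wrapper seen in the log is OpenAI-style structured-output syntax that Gemini's endpoint does not recognize. The field names below follow the public Gemini and OpenAI APIs; the `summary` schema itself is an invented example.

```python
# OpenAI-style structured output (the shape Gemini rejected, per the log):
openai_style = {
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "summary",
            "schema": {
                "type": "object",
                "properties": {"summary": {"type": "string"}},
            },
        },
    }
}

# Gemini's native API instead expects the schema directly, with
# uppercase type enum names:
gemini_style = {
    "generationConfig": {
        "response_mime_type": "application/json",
        "response_schema": {
            "type": "OBJECT",
            "properties": {"summary": {"type": "STRING"}},
        },
    }
}

# The unknown field Gemini complained about is absent from the valid payload:
assert "json_schema" not in gemini_style["generationConfig"]["response_schema"]
```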
Has anyone else encountered this issue, or does anyone have an idea of how to resolve it? Thanks in advance.
Falco
(Falco)
May 10, 2025, 2:18pm
What model are you using?
I am currently using Gemini 2.0 Flash, free tier.
Falco
(Falco)
May 11, 2025, 11:25pm
Can you try setting the provider to “OpenAI” and the endpoint to https://generativelanguage.googleapis.com/v1beta/chat/completions?
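For anyone curious what this workaround does under the hood, here is a minimal sketch of an OpenAI-style chat completions request aimed at Gemini's OpenAI-compatible endpoint. The `GEMINI_API_KEY` value and the prompt are placeholders, and no request is actually sent; with a real key, `urllib.request.urlopen(request)` would perform the call.

```python
import json
import urllib.request

# Gemini's OpenAI-compatible chat completions endpoint, as suggested above.
ENDPOINT = "https://generativelanguage.googleapis.com/v1beta/chat/completions"
API_KEY = "GEMINI_API_KEY"  # placeholder, not a real key

# Standard OpenAI-style chat payload; Gemini accepts this shape here.
payload = {
    "model": "gemini-2.0-flash",
    "messages": [
        {"role": "user", "content": "Summarize this topic in one sentence."}
    ],
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)
# urllib.request.urlopen(request) would send it with a real key.
```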
I have followed the steps you gave, and I can confirm the LLM is now working as expected. Thanks!
sam
(Sam Saffron)
May 12, 2025, 2:12am
Very happy it is sorted, but I still think there is a bug here we should sort out @Falco / @Roman_Rizzi
Stuff should maybe fall back if for any reason a model says it will return JSON but does not. This is one area we should eval.
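The fallback idea can be sketched in a few lines: try to parse the completion as JSON, and degrade to the raw text when the model refuses to comply. This is a hypothetical illustration of the behavior being discussed, not the actual Discourse AI implementation; the function name and return shape are invented.

```python
import json

def complete_with_json_fallback(raw_response: str) -> dict:
    """Best-effort parse: return structured data when the model honored
    the JSON schema, otherwise fall back to the raw text.

    Hypothetical sketch, not the actual Discourse AI code."""
    try:
        return {"structured": True, "data": json.loads(raw_response)}
    except json.JSONDecodeError:
        return {"structured": False, "data": raw_response}

# A compliant reply parses; a refusal degrades gracefully to plain text.
ok = complete_with_json_fallback('{"summary": "All good."}')
bad = complete_with_json_fallback("Sorry, I cannot produce JSON.")
```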
Falco
(Falco)
May 12, 2025, 2:47am
Oh definitely, I wanted to help isolate it to the Google API instead of the model so we can work on a fix this week.
The Gemini error was fixed in:
main ← structured_output_differences (opened 02:09PM - 15 May 25 UTC)
This change fixes two bugs and adds a safeguard.
The first issue is that the … schema Gemini expected differed from the one sent, resulting in 400 errors when performing completions.
The second issue was that creating a new persona wouldn't define a method for `response_format`. This has to be explicitly defined when we wrap it inside the Persona class. Also, there was a mismatch between the default value and what we stored in the DB: some parts of the code expected symbols as keys and others strings.
Finally, we add a safeguard for when, even if asked to, the model refuses to reply with valid JSON. In this case, we make a best effort to recover and stream the raw response.
It was expecting a slightly different format for the JSON schema. Also, we’ll now treat the completion as plain text if the model doesn’t return valid JSON when asked to.
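The "slightly different format" part of the fix can be illustrated with a small normalizer that unwraps an OpenAI-style `json_schema` envelope and uppercases type names into the enum form Gemini's `generationConfig.response_schema` accepts. This is a hypothetical sketch under those assumptions, not the code from the actual PR.

```python
def to_gemini_schema(schema: dict) -> dict:
    """Normalize an OpenAI-style JSON schema into the shape Gemini's
    generationConfig.response_schema accepts: no json_schema wrapper,
    uppercase type enum names. Hypothetical sketch, not the real patch."""
    # Unwrap an OpenAI-style {"type": "json_schema", "json_schema": {...}} envelope.
    if schema.get("type") == "json_schema":
        schema = schema["json_schema"].get("schema", {})
    out = {}
    for key, value in schema.items():
        if key == "type" and isinstance(value, str):
            out[key] = value.upper()          # "object" -> "OBJECT", etc.
        elif key == "properties":
            out[key] = {name: to_gemini_schema(sub) for name, sub in value.items()}
        elif key == "items":
            out[key] = to_gemini_schema(value)
        else:
            out[key] = value
    return out

converted = to_gemini_schema({
    "type": "json_schema",
    "json_schema": {"schema": {
        "type": "object",
        "properties": {"summary": {"type": "string"}},
    }},
})
```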
Roman_Rizzi
(Roman Rizzi)
Closed
May 19, 2025, 11:00am
This topic was automatically closed after 2 days. New replies are no longer allowed.