We’ve run into another issue using Mistral for embeddings.
- Per this topic, use OpenAI as the provider and the Mistral service URL as the URL
- Select the tokenizer, sequence length, and distance function
- Set the model name to ‘mistral-embed’
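What the provider presumably does under the hood when “Run Test” is clicked is send an OpenAI-style embeddings request to the Mistral service URL. A minimal sketch of that request body, where the endpoint URL and the payload shape are assumptions based on the steps above (not taken from the plugin’s code):

```python
import json

# Assumed Mistral service URL entered in the provider settings.
MISTRAL_URL = "https://api.mistral.ai/v1/embeddings"

def build_openai_style_payload(texts, dimensions=None):
    """Build an OpenAI-compatible embeddings request body.

    Hypothetical helper for illustration only.
    """
    payload = {"model": "mistral-embed", "input": texts}
    if dimensions is not None:
        # This is the field Mistral rejects with a 422 (see the error below).
        payload["dimensions"] = dimensions
    return payload

payload = build_openai_style_payload(["hello world"], dimensions=2000)
print(json.dumps(payload))
```

Sending this payload to the Mistral endpoint is what triggers the error shown below.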
When a dimension is entered, Mistral complains that it does not support it.
Trying to contact the model returned this error:

```json
{
  "object": "error",
  "message": {
    "detail": [
      {
        "type": "extra_forbidden",
        "loc": ["body", "dimensions"],
        "msg": "Extra inputs are not permitted",
        "input": 2000
      }
    ]
  },
  "type": "invalid_request_error",
  "param": null,
  "code": null,
  "raw_status_code": 422
}
```
This is because Mistral calls this parameter `output_dimension`, so its API is not fully OpenAI-compatible.
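A hypothetical workaround would be a thin shim that renames the field before forwarding the request. The `output_dimension` key comes from Mistral’s naming mentioned above; the function itself is a sketch, not part of any existing plugin:

```python
def to_mistral_payload(openai_payload):
    """Rename OpenAI's 'dimensions' to Mistral's 'output_dimension'.

    Hypothetical translation shim. Mistral's embeddings endpoint
    rejects unknown fields ('extra_forbidden'), so the OpenAI key
    must be removed, not merely duplicated.
    """
    payload = dict(openai_payload)  # copy, don't mutate the caller's dict
    if "dimensions" in payload:
        payload["output_dimension"] = payload.pop("dimensions")
    return payload

print(to_mistral_payload(
    {"model": "mistral-embed", "input": ["hi"], "dimensions": 2000}
))
```

Without such a translation layer somewhere between the provider and Mistral, the `dimensions` field will always be rejected.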
When I leave out the dimensions parameter, “Run Test” succeeds, but I then can’t save the model: it tells me that “dimensions” is a required parameter.
This could be resolved by making the dimensions parameter optional, or by making Mistral a first-class provider (better).
