How to Use Ollama With Embeddings

Happy to start another thread if that's more appropriate. This thread was very helpful for getting Ollama up and running, thank you!

I am trying to get Ollama working with embeddings as well, but despite entering settings similar to those mentioned above, I get the following error when running the test:

Trying to contact the model returned this error: {"error":{"message":"[] is too short - 'messages'","type":"invalid_request_error","param":null,"code":null}}

What endpoint are you using? That error suggests the request is hitting the chat completions endpoint, which expects a non-empty messages array. The OpenAI-compatible embeddings endpoint is at v1/embeddings and should work just fine.
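
For reference, here is a minimal sketch of calling Ollama's OpenAI-compatible embeddings endpoint directly, useful for checking the endpoint outside of any other tooling. It assumes Ollama is listening on the default localhost:11434 and that some embedding model has been pulled; nomic-embed-text below is just one example choice, not a requirement:

```python
# Minimal sketch: query Ollama's OpenAI-compatible embeddings endpoint.
# Assumes Ollama is running on the default port and the model named below
# has been pulled (e.g. `ollama pull nomic-embed-text`); substitute your
# own embedding model name as needed.
import requests

response = requests.post(
    "http://localhost:11434/v1/embeddings",
    json={
        "model": "nomic-embed-text",  # example model; any pulled embedding model works
        "input": "The quick brown fox",
    },
    timeout=30,
)
response.raise_for_status()

# The response follows the OpenAI embeddings format: a "data" list of
# embedding objects, each with an "embedding" vector.
embedding = response.json()["data"][0]["embedding"]
print(f"Got a {len(embedding)}-dimensional embedding")
```

If this works but the in-app test still fails, the configured endpoint is likely still pointing at v1/chat/completions rather than v1/embeddings.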

That was exactly what I was missing, thank you, Falco!
