GPT-4o mini landed - going to be supported?

Title says it all – just curious whether there's a plan to support this model and, if so, whether there's a rough estimate of when it could be implemented?

This is a massive improvement for anyone currently using GPT-3.5 Turbo, as it's smarter and over 60% cheaper.

More info about this model on TechCrunch

4 Likes

It already works with the Discourse AI plugin; you just need to make sure to use gpt-4o-mini as the name in your LLM configuration:

[Screenshot: LLM configuration page with the model name set to gpt-4o-mini]
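
If you want to sanity-check the model name outside of Discourse first, here is a minimal sketch using the official OpenAI Python client (it assumes the `openai` package is installed and `OPENAI_API_KEY` is set in your environment):

```python
# Minimal check that "gpt-4o-mini" is accepted by your API key.
# Assumes: `pip install openai` and OPENAI_API_KEY exported in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
    max_tokens=5,
)
print(response.choices[0].message.content)
```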

8 Likes

Well, this is embarrassing. Thanks, boss :grin::robot:
Perhaps I haven't played around with it enough - noted.

2 Likes

We will see how much smarter it is, or how much better it is at hallucinating :smirk: I remember how much buzz there was when GPT-4o came out, and now there are a lot of complaints about it.

But good to try.

A bit off topic, but I don't totally understand why we have two places to specify the model and API: the settings and the LLM section?

1 Like

That’s a good point. I doubt the hallucination problems have improved much with this model; however, I managed to largely mitigate them by adding a bunch of constraints throughout the system prompt, though that has its own downsides of course.
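
Roughly along these lines, though this is a simplified sketch with illustrative constraint wording rather than the exact prompt, again using the OpenAI Python client directly:

```python
# Illustrative system prompt with explicit anti-hallucination constraints.
# The wording here is an example only, not the exact prompt used in production.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = """You are a helpful forum assistant.
Constraints:
- Answer only from the context you are given.
- If the answer is not in the context, say you do not know.
- Never invent URLs, setting names, or version numbers.
- Keep answers under 150 words."""

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "How do I change my forum username?"},
    ],
)
print(response.choices[0].message.content)
```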

It only makes sense for the manual LLM setup, so I find myself asking the same question :sweat_smile: