Trying to contact the model returned this error:

```json
{
  "error": {
    "message": "Unsupported value: 'messages[0].role' does not support 'system' with this model.",
    "type": "invalid_request_error",
    "param": "messages[0].role",
    "code": "unsupported_value"
  }
}
```
It seems these new models do not support a system role. o1-mini is 80% cheaper than the top model, which might make it useful for complex tasks.
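For anyone hitting the same error, here is a minimal sketch of a workaround using the official openai Python SDK (the prompt text and the `startswith("o1")` check are just illustrative, not how the plugin does it): fold the system prompt into the first user message whenever the target model is an o1 variant.

```python
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = "You are a helpful forum assistant."  # illustrative prompt
model = "o1-mini"

# o1 models reject messages[0].role == "system", so fold the system
# prompt into the first user message instead (assumption: this is an
# acceptable substitute for your use case).
if model.startswith("o1"):
    messages = [
        {"role": "user",
         "content": f"{SYSTEM_PROMPT}\n\nHow do I theme a Discourse category?"},
    ]
else:
    messages = [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "How do I theme a Discourse category?"},
    ]

response = client.chat.completions.create(model=model, messages=messages)
print(response.choices[0].message.content)
```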
These models aren’t available in the API for your account while we’re in this short beta period. (Developers on usage tier 5 will have access, but we’ll expand access to more tiers.) We’re continuing to improve o1 and we’ll be in touch as soon as it’s available to you in the API.
I have to admit 4o has been a lifesaver for me. With this being an additional 80% saving on top of the current one, it will work out a lot better than paying ChatGPT £18.99 per month.
I use this manually for adding CSS scripts to the forum and making my own designs. It is incredible what it can do.
But it is also a risk for developers, since it can take over a lot of their work.
While I understand the reasoning around streaming, many may not. It might benefit those here to have a short explanation of why streaming was mentioned.
We can expect this model to remain quite expensive even well after the full API release, so hopefully we can get some per-user usage limits for AI personas implemented.
Discourse uses the streaming feature (printing the response as soon as it comes in, similar to the ChatGPT web UI) to show the response in real time. o1 models do not support this, so Discourse needs to change its logic to work with non-streaming models.
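Roughly, the difference looks like this (a sketch with the openai Python SDK; the model names and prompt are placeholders). With `stream=True` the tokens arrive as chunks the UI can print as they come in; without it the client blocks until the whole reply is ready, which is the mode o1 currently requires.

```python
from openai import OpenAI

client = OpenAI()
prompt = [{"role": "user", "content": "Summarise this topic."}]  # placeholder

# Streaming (what Discourse does today): tokens are shown as they arrive.
stream = client.chat.completions.create(
    model="gpt-4o", messages=prompt, stream=True
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)

# Non-streaming (what o1 needs right now): wait for the whole response,
# then render it in one go.
response = client.chat.completions.create(model="o1-mini", messages=prompt)
print(response.choices[0].message.content)
```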
One interesting use for o1 might be as a custom tool: do all the chatting with GPT-4 and cheaper models, then have the model reach out to o1 in cases where it clearly needs some deeper reasoning.
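A rough sketch of that idea (everything here — the routing heuristic, the `needs_deep_reasoning` check, the model names — is hypothetical, not how Discourse AI actually works): a cheap model handles everyday chat, and only questions flagged as hard get forwarded to o1.

```python
from openai import OpenAI

client = OpenAI()

def needs_deep_reasoning(question: str) -> bool:
    # Hypothetical heuristic; in practice this could be a classifier,
    # a keyword list, or a tool call the cheap model decides to make.
    return any(word in question.lower() for word in ("prove", "derive", "plan", "optimise"))

def answer(question: str) -> str:
    if needs_deep_reasoning(question):
        # Escalate to o1 only when the extra reasoning (and cost) is worth it.
        model, messages = "o1-mini", [{"role": "user", "content": question}]
    else:
        # Cheaper model handles normal chat, with a regular system prompt.
        model, messages = "gpt-4o-mini", [
            {"role": "system", "content": "You are a concise forum assistant."},
            {"role": "user", "content": question},
        ]
    response = client.chat.completions.create(model=model, messages=messages)
    return response.choices[0].message.content

print(answer("Derive the closed form for this recurrence."))
```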