Why would that be of benefit and give us more than we have now? Can you please provide a more detailed, persuasive justification?
What do you mean by “a more friendly” OpenAI API?
I’ve already adopted the de-facto standard Ruby bridge to the official OpenAI platform API, a solution which is pretty friendly from a developer’s perspective and connects us directly to the source service.
Yeah my intention is to add other bots once there is another obvious good alternative, especially one that doesn’t attempt to nanny.
We have moderation on forums in any case so we can manage the output of less constrained bots.
There are some perfectly reasonable, innocent use cases which are currently blocked. For example, a user on the OpenAI forum complained the bot was not allowed to simulate fighting, which made it useless for his intended purpose of creating Dungeons & Dragons scenarios :).
The general benefit is that companies get to “do business with Microsoft” vs doing business with a startup. Often, legal departments place fewer obstacles in the way of approving Microsoft because it has SOC 2 and ISO certifications and so on.
First such report. I’m simply displaying the entire response, but a token limit setting is present and sent to the API. Try increasing it? I’m afraid there are limits on how much we can control what the bot responds with.
Based on the performance difference between Microsoft and ChatGPT, the content output by Microsoft’s AI API is safer, and I hope my forum users can get more definite answers.
Regarding commercial use, my view is that other Azure services, such as the translation service, have already been integrated by other plugins. I was wondering why you think the Azure API is commercial?
Are you saying that the Azure service is in fact an API to Bing, not just GPT-4? Looking at the documentation, that is not obvious to me:
This appears to mostly duplicate what OpenAI provides, and no more.
In any case I intend to add GPT-4 support when I have access, but presently plan to use only OpenAI’s API directly.
See Sam’s point above.
Note, this is free software. I cannot provide unlimited free enhancements. Just like everyone else, I have bills to pay. If a commercial entity is interested in funding this, I may look at it sooner. And note it’s not just the time it takes me to add features; I also have to support them for the life of the plugin, which becomes a greater burden the more complex the solution.
But at a minimum you will need to demonstrate this is not just duplicating the capabilities of GPT-4.
Increase max tokens. You can somewhat control the length if you give it directions in the system role, e.g. “answer as short as possible” (this is just an example; it will get you really short replies most of the time).
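To make the two knobs concrete, here is a minimal sketch of the request shape. This is illustrative only: the model name, the limit of 256, and the message contents are placeholders, not the plugin’s actual settings.

```ruby
# Sketch of a chat request: a system-role directive nudges the reply length,
# while max_tokens hard-caps how many tokens the API may return. If replies
# are being cut off mid-sentence, max_tokens is the setting to raise.
params = {
  model: "gpt-3.5-turbo",            # placeholder model name
  messages: [
    { role: "system", content: "Answer as short as possible." },
    { role: "user",   content: "Summarise this topic for me." }
  ],
  max_tokens: 256                    # illustrative limit, not the default
}

# With the ruby-openai gem, this hash would be sent roughly as:
#   client = OpenAI::Client.new(access_token: ENV["OPENAI_API_KEY"])
#   client.chat(parameters: params)
```

Note that `max_tokens` only truncates; it does not make the model aim for shorter answers, which is why the system-role directive is the better tool for tone and length.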