Ups the default timeout to 120 seconds (useful given recent problems with OpenAI API response times!)
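For reference, the bumped timeout would be passed through to the client roughly like this. This is only a sketch: it assumes the ruby-openai gem's `request_timeout` option, and the constant and helper names here are made up, not the plugin's actual code.

```ruby
# Hypothetical sketch: bump the default request timeout to 120 seconds.
# "request_timeout" is the ruby-openai client option; everything else
# (constant name, helper name) is illustrative.
DEFAULT_TIMEOUT_SECONDS = 120

def client_options(timeout: DEFAULT_TIMEOUT_SECONDS)
  {
    access_token: ENV["OPENAI_API_KEY"],
    request_timeout: timeout # seconds to wait before giving up on a slow response
  }
end
```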
Moves console logging behind a setting
Note that I was already using Rails.logger.debug for some elements, but console feedback is important to me during my development workflow, so this is now behind a setting, default OFF.
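A minimal sketch of the idea, assuming a boolean setting (the module and setting names here are made up for illustration; the real plugin would read a site setting):

```ruby
module BotLogging
  # Stand-in for the real site setting that gates console output; default OFF.
  def self.console_logging_enabled
    false
  end

  # Prints to the console only when the setting is on. Returns the message
  # when printed, nil when suppressed, so callers can tell what happened.
  def self.feedback(message)
    return nil unless console_logging_enabled
    puts message
    message
  end
end
```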
Also I’ve merged this useful PR:
FEATURE: Add fine-tuning to settings (top_p, frequency_penalty & presence_penalty; see the OpenAI API)
IMPROVE: Better locales and make the URLs clickable
FIX: Respect max_tokens in the gpt-3.5-turbo model too; this was not included in the request before
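Taken together, the merged changes amount to building a request body something like the following. This is a sketch, not the plugin's actual code: the parameter names are the OpenAI API's, but the helper and settings hash are illustrative.

```ruby
# Hypothetical sketch of merging the new sampling settings into a
# chat completion request body for gpt-3.5-turbo.
def completion_params(messages, settings)
  {
    model: "gpt-3.5-turbo",
    messages: messages,
    max_tokens: settings[:max_tokens],               # now respected for gpt-3.5-turbo too
    top_p: settings[:top_p],                         # nucleus-sampling cutoff (0..1)
    frequency_penalty: settings[:frequency_penalty], # penalise verbatim repetition (-2..2)
    presence_penalty: settings[:presence_penalty]    # encourage new topics (-2..2)
  }
end
```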
Why would that be of benefit and give us more than we have now? Can you please provide a more detailed, persuasive, justification?
What do you mean by “a more friendly” OpenAI API?
I’ve already adopted the de-facto standard Ruby bridge to the official OpenAI platform API, a solution which is pretty friendly from a developer’s perspective and connects us directly to the source service.
IMHO they are trying to manage what people talk about, think, and share around the world, applying direct censorship and deciding what people can or can’t do, trying to push something like an ad-free Google plus AI.
I stayed away from ChatGPT because of that, and then I registered without a phone number, using a VPN and an e-mail alias.
But now my first thoughts feel validated, because Microsoft is laying off 10,000 employees while investing $10 billion into AI, and they are breaking all the good things.
Yeah my intention is to add other bots once there is another obvious good alternative, especially one that doesn’t attempt to nanny.
We have moderation on forums in any case so we can manage the output of less constrained bots.
There are some perfectly reasonable, innocent use cases which are currently blocked. For example, a user on the OpenAI forum complained the bot was not allowed to simulate fighting, which made it useless for his intended purpose of creating Dungeons & Dragons scenarios :).
The general benefit is that companies get to “do business with Microsoft” vs doing business with a startup. Legal often places fewer obstacles in the way of approving Microsoft because they have SOC 2 and ISO certifications and so on.
First such report. I’m simply displaying the entire response, but a token limit setting is present and sent to the API. Try increasing it? I’m afraid there are limits on how much we can control what the bot responds with.
Fine. Is that the same thing as “friendly”, though? And that doesn’t pass the common sense test: surely Azure is just passing the same data through, or are they actively altering it?
Azure has private copies of the models and a heavy investment in OpenAI. It’s the only company that gets this level of access; it upsets Musk so much that he rants about it.
Judging by the performance difference between Microsoft and ChatGPT, the content output by Microsoft’s AI API is more secure, and I hope my forum users can get more definite answers.
Regarding commerciality, my opinion is that other Azure services, such as translation, have already been integrated by other plugins. I was wondering what the reason is that you think the Azure API is commercial?
Are you saying that the Azure service is in fact an API to Bing, not just GPT-4? Looking at the documentation, that is not obvious to me:
This looks to mostly duplicate what OpenAI provides, and no more.
In any case I intend to add GPT-4 support when I have access, but plan presently only to use OpenAI’s API directly.
See Sam’s point above.
Note, this is free software. I cannot provide unlimited free enhancements; just like everyone else, I have bills to pay. If a commercial entity is interested in funding this, I may look at it sooner. And note that it’s not just the time it takes me to add features: I also have to support them for the life of the plugin, which becomes a greater burden the more complex the solution.
But at a minimum you will need to demonstrate that this is not just duplicating the capabilities of GPT-4.
Increase max tokens. You can somewhat control the length if you give it directions in the system role, e.g. “answer as short as possible” (this is just an example; it will get you really short replies most of the time).
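Concretely, the system-role nudge could be prepended to the conversation like this (illustrative only; the helper name is made up, and the exact wording and its effect on length will vary):

```ruby
# Hypothetical sketch: steer reply length via a system-role instruction.
def build_messages(user_prompt)
  [
    # The system message steers style; stronger wording tends to give
    # shorter replies, but the model does not always obey it.
    { role: "system", content: "Answer as short as possible." },
    { role: "user", content: user_prompt }
  ]
end
```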