No, you see, the problem I am beginning to notice with the Discourse API is that the combination of granular, "sharp" API calls and even a moderate number of users means you can easily hit 60 calls a minute.
Let me give you a scenario. We run a support service that is tied to our customers' software licenses. When a license expires, we want to do multiple things with that user in Discourse…
First, we need to find out whether the user exists in Discourse at all. There is a chance they may not be in Discourse yet. That is one API call.
If they do exist, we need to move them to another group. We have a group for each level of license entitlement. That is another call.
We also want to send the user a message letting them know why their access has been downgraded (because their license expired). That is a third call.
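To make the per-user cost concrete, here is a rough Python sketch of those three calls. The endpoint paths and parameter names are my best reading of the Discourse REST API, so treat them as assumptions and check the official docs; each helper just builds the request spec so the call count per user is easy to see.

```python
# Hypothetical Discourse instance and API key holder (assumptions for the sketch).
BASE = "https://forum.example.com"

def lookup_user(username):
    # Call 1: does this user exist in Discourse yet?
    return {"method": "GET", "url": f"{BASE}/u/{username}.json"}

def add_to_group(group_id, username):
    # Call 2: move the user into the group matching their new entitlement.
    return {"method": "PUT",
            "url": f"{BASE}/groups/{group_id}/members.json",
            "data": {"usernames": username}}

def send_expiry_pm(username):
    # Call 3: private message explaining why access was downgraded.
    return {"method": "POST",
            "url": f"{BASE}/posts.json",
            "data": {"title": "Your license has expired",
                     "raw": "Your access has been downgraded because your license expired.",
                     "archetype": "private_message",
                     "target_recipients": username}}

calls_per_user = [lookup_user("alice"),
                  add_to_group(42, "alice"),
                  send_expiry_pm("alice")]
print(len(calls_per_user))  # 3 — three API calls for one expired license
```

Each spec would then be fired with an HTTP client of your choice; the point is simply that one expired license costs three requests.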
Each of these on its own doesn't seem bad against the rate limit. But if I run a nightly script to find which of our users' licenses expired, and more than 20 of them did (20 users * 3 calls = 60 calls), I am going to hit that limit. I am not talking about thousands of users; we process anywhere from 50-80 a night… right now. We have over 7k users in Discourse at the moment, and that number increases daily.
Now, we can get around this limitation in two ways. We can slow down our calls into Discourse, which is what we do now (our scripts sleep between calls), or we can look for ways to combine API calls. Obviously, if I can call Discourse once to send 50 private messages or move a whole batch of users into a group, that is advantageous; it increases our throughput.
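The sleep workaround is easy to reason about: with 3 calls per user against a 60-calls-per-minute limit, you can safely process 20 users a minute, i.e. 3 seconds of sleep per user. A minimal sketch (the limit and call count are the numbers from this thread, not anything the API reports):

```python
import time

RATE_LIMIT = 60      # calls per minute, the limit discussed above
CALLS_PER_USER = 3   # lookup + group move + PM

def delay_between_users(rate_limit=RATE_LIMIT, calls_per_user=CALLS_PER_USER):
    # Users we can safely process per minute, spread evenly over 60 seconds.
    users_per_minute = rate_limit / calls_per_user
    return 60.0 / users_per_minute

def process_expired(users, handle_user, sleep=time.sleep):
    for user in users:
        handle_user(user)             # fires the 3 calls for this user
        sleep(delay_between_users())  # then pace so we stay under the limit

print(delay_between_users())  # 3.0 seconds of sleep per expired user
```

At 3 seconds per user, an 80-user night takes four minutes of wall-clock time just waiting on the rate limit, which is exactly the throughput cost I am describing.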
But as you can see, the calls add up. In and of themselves they don't seem like an issue; compound them and they can begin to cripple you. Most APIs I have dealt with have the one-off endpoints, but also offer bulk operations for "en masse" manipulation… sometimes at a higher price or business-level tier.
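Just to show how much a bulk endpoint would change the math, here is a small sketch comparing per-user calls against a hypothetical bulk endpoint that accepts a batch of users per request (the batch size of 50 is an assumption, not something Discourse documents):

```python
import math

def calls_needed(users, per_user_calls=3, bulk_batch=None):
    # Without bulk endpoints, every user costs per_user_calls requests.
    if bulk_batch is None:
        return users * per_user_calls
    # With a hypothetical bulk endpoint taking bulk_batch users per request,
    # each per-user operation collapses into ceil(users / bulk_batch) requests.
    return per_user_calls * math.ceil(users / bulk_batch)

print(calls_needed(80))                  # 240 calls: four minutes against the limit
print(calls_needed(80, bulk_batch=50))   # 6 calls for the same nightly run
```

That is the difference between a nightly job that has to sleep its way through the rate limit and one that finishes in a handful of requests.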
I am just exploring the options while helping you guys understand the possible need to scale some of these operations up a little.