Private messages via the API without recipients seeing the other targeted users

I did a search and looked around a bit but didn’t see quite what I am looking for. I know you can private message users through the API, and you can specify several target usernames as part of that, which works as expected. However, from what I can tell, anyone who is targeted with the private message can see the other users who were also targeted at the same time.

For instance, if I message “User1” and “User2” as part of the same API call, then when “User2” gets the message, they can see that “User1” was also sent that message. I was wondering if it would be possible to provide an optional parameter to hide the recipient names from the other users. I would like to be able to send several service-related private messages (targeting multiple users at once) and have each user see the message as being sent only to them.
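For reference, here is roughly the call I mean: a minimal sketch in Python with requests. The `/posts.json` endpoint and field names are my reading of the standard Discourse API docs (newer installs authenticate with the `Api-Key`/`Api-Username` headers, older ones with `api_key`/`api_username` query parameters), and the forum URL, key, and usernames are placeholders.

```python
import requests

BASE_URL = "https://forum.example.com"      # placeholder forum URL
HEADERS = {
    "Api-Key": "YOUR_API_KEY",              # placeholder admin API key
    "Api-Username": "system",               # user the call acts as
}

# One call, two recipients -- today both recipients can see each other
# on the resulting private message.
resp = requests.post(
    f"{BASE_URL}/posts.json",
    headers=HEADERS,
    data={
        "title": "Service notice",
        "raw": "Your support entitlement is about to change.",
        "archetype": "private_message",
        "target_usernames": "User1,User2",  # comma-separated recipients
    },
)
resp.raise_for_status()
```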

Hopefully the feature has already been created. :slight_smile:

Why don’t you just send the messages to one user at a time?
You’re using the API anyway so that shouldn’t be much of an effort.

Well, I am at the moment but obviously there are rate limits to consider and I was hoping to be able to make one call into Discourse and not 50+.
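Here is roughly what the one-at-a-time version looks like today (same caveats as above about the `/posts.json` parameters being my reading of the docs; the pause is just a crude way to stay under the per-minute rate limit):

```python
import time
import requests

BASE_URL = "https://forum.example.com"
HEADERS = {"Api-Key": "YOUR_API_KEY", "Api-Username": "system"}

recipients = ["User1", "User2"]             # in practice this is 50+ usernames

for username in recipients:
    requests.post(
        f"{BASE_URL}/posts.json",
        headers=HEADERS,
        data={
            "title": "Service notice",
            "raw": "Your support entitlement is about to change.",
            "archetype": "private_message",
            "target_usernames": username,   # one recipient per call
        },
    ).raise_for_status()
    time.sleep(1.5)                         # crude pause to stay under the rate limit
```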


I wonder, though: what you are describing appears to be an attempt at turning Discourse into MailChimp.

With our current rate limits you can get 50-100 PMs out there in a few minutes. You get to call the API 60 times a minute.

Where the API limit is a big problem is if you are trying to blast a PM to 20,000 users. But should you not be using MailChimp for that?

No, you see, the problem I am beginning to notice with the Discourse API is that with a combination of “sharp”, specific API calls and even a moderate number of users, you can quickly hit 60 calls a minute.

Let me give you a scenario. We run a support service that is tied to our customers’ software licenses. When a license expires, we want to do multiple things with the user in Discourse…

  1. We first need to find out whether the user exists in Discourse at all. There is a chance that they may not be in Discourse yet. That is one API call.

  2. If they do exist, we need to move them to another group; which group depends on their level of license entitlement. That is another call.

  3. We want to send the user a message letting them know why their access has been downgraded (because their license expired). That is a third call.

On its own, each of these seems fine under the rate limit. But if I run a nightly script to find which of our users’ licenses expired and there are more than 20 of them (20 * 3 = 60 calls), I am going to hit that limit. I am not talking about processing thousands of users; we handle anywhere from 50-80 a night… right now. We have over 7k users in Discourse at the moment, and that number grows daily.
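Putting that together, the nightly job does roughly the following per expired license. This is a hedged sketch: the user lookup, group membership, and message endpoints are the ones I believe the public API exposes (verify against your version), and the group id and message text are placeholders.

```python
import requests

BASE_URL = "https://forum.example.com"
HEADERS = {"Api-Key": "YOUR_API_KEY", "Api-Username": "system"}
EXPIRED_GROUP_ID = 42                       # placeholder id of the downgraded-access group


def process_expired_user(username: str) -> None:
    # Call 1: does the user exist in Discourse at all?
    lookup = requests.get(f"{BASE_URL}/users/{username}.json", headers=HEADERS)
    if lookup.status_code == 404:
        return                              # not in Discourse yet, nothing to do
    lookup.raise_for_status()

    # Call 2: move them into the group for expired licenses.
    requests.put(
        f"{BASE_URL}/groups/{EXPIRED_GROUP_ID}/members.json",
        headers=HEADERS,
        data={"usernames": username},
    ).raise_for_status()

    # Call 3: tell them why their access has been downgraded.
    requests.post(
        f"{BASE_URL}/posts.json",
        headers=HEADERS,
        data={
            "title": "Your support access has changed",
            "raw": "Your license has expired, so your forum access was downgraded.",
            "archetype": "private_message",
            "target_usernames": username,
        },
    ).raise_for_status()
```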

Now, we can get around this limitation in two ways: we can slow down the calls into Discourse, which is what we do now (our scripts sleep between calls), or we can look for ways to combine our API calls. Obviously, if I can call Discourse once and send 50 private messages, or move a whole batch of users into a specific group, that is advantageous; it increases our throughput.

But as you can see, the calls add up. In and of themselves they don’t seem like an issue; compound them and they can begin to cripple you. Most APIs I have dealt with have the one-off endpoints but also offer bulk operations for “en masse” manipulation… sometimes at a higher price or business-level tier.

I am just exploring the options while helping you guys understand the possible need to scale some of these operations up a little. :slight_smile:


Looking up a batch of users is something that either should already exist or is something we should add.

Adding a bunch of users to a group in an atomic operation is already supported.
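Roughly speaking, it is a single call that takes a comma-separated list of usernames, something like this (a sketch only; double-check the exact endpoint and parameter names against your version, and swap in your own group id and credentials):

```python
import requests

BASE_URL = "https://forum.example.com"
HEADERS = {"Api-Key": "YOUR_API_KEY", "Api-Username": "system"}
GROUP_ID = 42                               # placeholder group id

# One request adds the whole batch of users to the group.
requests.put(
    f"{BASE_URL}/groups/{GROUP_ID}/members.json",
    headers=HEADERS,
    data={"usernames": "User1,User2,User3"},  # comma-separated batch
).raise_for_status()
```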

Which only leaves the “batch PM” problem. I am pretty reluctant to add this because it would queue a very large amount of work on the server; a single API call that can trigger an hour of CPU work is very, very risky.


Oh, of course, I understand that. It is the whole question of scale versus what is achievable in the short term. Obviously, if you add more servers and all that, you can handle the extra capacity.

I also realize you do support some batched endpoints, and we do use some of them. I was just providing a brief example of how some of this adds up. It is manageable right now, but obviously my company is constantly looking to do more and more. They are really trying to tie together two systems: our CRM and Discourse. I am not a huge fan of how this ramps up, but I can easily tell you that there will certainly be more and more interaction as time goes on. I have already made them aware of your API limitations, but their focus is obviously on the business and the customer, not on the technical limitations. Those problems are for me. :wink:

On our hosting side, bumping up the API limits is something we offer our Enterprise customers; the reason we can do this more safely is that you have dedicated infrastructure. Additionally, on the Enterprise tier the door would open for custom plugins (at a certain monthly cost), which would allow you to design a specific endpoint that operates on a batch of users according to your business rules.

A post was split to a new topic: API rate limits and cost difference with enterprise plans