You Have Performed this Action Too Many Times Error

I get the error below quite frequently.

{"errors":["You’ve performed this action too many times. Please wait a few seconds before trying again."],"error_type":"rate_limit","extras":{"wait_seconds":0}}

How do I eliminate this error? Please suggest.


Go look in Settings > Rate Limits and change the values as you please.

Hi @IAmGav,

I have set:

  1. rate limit create topic = 0 (After creating a topic, users must wait (n) seconds before creating another topic.)
  2. rate limit create post = 0 (After posting, users must wait (n) seconds before creating another post.)
  3. rate limit new user create topic = 0 (After creating a topic, new users must wait (n) seconds before creating another topic.)
  4. rate limit new user create post = 0 (After posting, new users must wait (n) seconds before creating another post.)

But the error still persists. Are there any other limits I need to look at?


I have disabled all the settings mentioned above.

But I still get the “You Have Performed this Action Too Many Times” error.
Is there something else I have to do?

So what exactly are you doing that causes this error?

I am trying to create topics using the API. While creating around 100 topics through the API, it throws this error.
I am also updating tags on my topics through the API; there are hundreds of topics in my forum that do not have tags, so I am adding tags to them through the API.

So what are the values of

DISCOURSE_MAX_USER_API_REQS_PER_MINUTE
DISCOURSE_MAX_USER_API_REQS_PER_DAY
DISCOURSE_MAX_ADMIN_API_REQS_PER_KEY_PER_MINUTE

and are you sure you’re staying below those numbers?
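For reference, on a standard Docker-based install these are environment variables rather than admin site settings, so they would go in the `env:` section of `containers/app.yml` and take effect after a rebuild. A hedged sketch, using the variable names above; the values are only illustrative examples, not recommendations:

```yaml
env:
  ## API rate limits -- example values, tune to your own traffic
  DISCOURSE_MAX_USER_API_REQS_PER_MINUTE: 60
  DISCOURSE_MAX_USER_API_REQS_PER_DAY: 5760
  DISCOURSE_MAX_ADMIN_API_REQS_PER_KEY_PER_MINUTE: 120
```

Then run `./launcher rebuild app` for the change to apply.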


Hello :wave:
I’m facing the same issue but with read actions

I’m building an integration with Discourse and I use the API to read a lot of posts. I don’t do any write operations, just reads. In order to get the latest posts, I do the following:

  1. Get latest topics using /latest.json endpoint
  2. Sequentially get all the topics via /t/:id so I can get the stream of posts and paginate through it
  3. If there are more than 20 posts in that topic, get their ids from “stream” and fetch them sequentially in chunks of size 20

Also, I run all requests through a queue and try to send no more than ~25 requests per 10 seconds, but I still often see the “You Have Performed this Action Too Many Times” error for topic or post read requests. I went to the Discourse settings but can’t find any limits for reads there; I can only see limits for “Create topics” and other write operations.
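A minimal Python sketch of that read flow, assuming a standard Discourse install. The forum URL and credentials are placeholders; `/latest.json`, `/t/:id.json`, and the `post_ids[]` batch endpoint are the ones described in the steps above.

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

BASE = "https://forum.example.com"                             # placeholder forum URL
HEADERS = {"Api-Key": "REPLACE_ME", "Api-Username": "system"}  # placeholder credentials
CHUNK = 20  # Discourse serves at most 20 posts per post_ids[] request

def chunks(ids, size=CHUNK):
    """Split a list of post ids into batches of at most `size`."""
    return [ids[i:i + size] for i in range(0, len(ids), size)]

def get_json(path, params=None):
    """GET a JSON endpoint with API credentials attached."""
    url = BASE + path
    if params:
        url += "?" + urlencode(params, doseq=True)
    with urlopen(Request(url, headers=HEADERS)) as resp:
        return json.load(resp)

def latest_topic_ids():
    """Topic ids from the first page of /latest.json."""
    return [t["id"] for t in get_json("/latest.json")["topic_list"]["topics"]]

def fetch_topic_posts(topic_id):
    """Fetch every post in a topic, paginating through its post stream."""
    topic = get_json(f"/t/{topic_id}.json")
    posts = list(topic["post_stream"]["posts"])        # first page comes inlined
    loaded = {p["id"] for p in posts}
    missing = [i for i in topic["post_stream"]["stream"] if i not in loaded]
    for batch in chunks(missing):                      # remaining ids, 20 at a time
        more = get_json(f"/t/{topic_id}/posts.json", {"post_ids[]": batch})
        posts.extend(more["post_stream"]["posts"])
    return posts
```

Note that each call to `fetch_topic_posts` costs one request per topic plus one per chunk of 20 extra posts, which is exactly why short topics dominate the request budget.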

Is there anything I can do about it? Thank you for any tips and sorry for bringing up an old topic


Looks like I’m hitting the max_admin_api_reqs_per_minute limit. Can it be customized? I can’t see it in Settings > Rate limits.

EDIT: actually looks like two limits kick in there. admin_api_key_rate_limit and ip_10_secs_limit


I’m curious whether adding ?print=true would help reduce the number of API calls when reading.

This will allow you to fetch up to 1000 posts in a single API call.
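For example, the print view is requested by appending the parameter to the topic URL (the topic id and forum URL here are hypothetical):

```python
# Hypothetical topic id; ?print=true requests the long-form "print view"
# of a topic, which returns many more posts per response than the
# default 20-post page.
BASE = "https://forum.example.com"   # placeholder forum URL
topic_id = 1234
print_url = f"{BASE}/t/{topic_id}.json?print=true"
```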

Oh, for some reason I thought that ?print had even stricter rate limits.

But it seems the issue isn’t about ?print=true usage but something else. I will definitely try it.

But as I understand it, it will only help with topics that have more than 20 posts. I believe most of our topics have fewer, so the actual bottleneck is too many requests for topics.


Yeah, that setting is for limiting users. If you have an admin API key, it doesn’t affect you.

Ah yes, very likely then.

Besides just checking for 429 errors and slowing down for the amount of time specified, there are a couple of options.

I would start by using the Data Explorer plugin to write a query that gets all of the topics you are after. I believe it will return up to 1000 results. You can then use the API to call the query and get the response.
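A sketch of calling a saved Data Explorer query over the API, assuming the plugin’s usual run endpoint (`/admin/plugins/explorer/queries/:id/run`); the forum URL, API key, and query id are placeholders:

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

BASE = "https://forum.example.com"   # placeholder forum URL
HEADERS = {"Api-Key": "REPLACE_ME", "Api-Username": "system"}  # placeholders

def run_url(query_id):
    """Path of the Data Explorer 'run' endpoint for a saved query."""
    return f"{BASE}/admin/plugins/explorer/queries/{query_id}/run"

def run_query(query_id, params=None):
    """POST to the run endpoint; the response JSON carries the result rows."""
    body = urlencode({"params": json.dumps(params or {})}).encode()
    req = Request(run_url(query_id), data=body, headers=HEADERS, method="POST")
    with urlopen(req) as resp:
        return json.load(resp)
```

One such call can replace hundreds of individual topic reads, which is the point of this option.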

Depending on your use case webhooks could also be helpful here. You could set them up for each new topic and post and just listen for all the latest content.

If you still determine that you need to raise the API rate limits, that is something we can do, but only for sites on our enterprise plan, as those are not on our shared standard/business hosting.

The problem with using the Data Explorer plugin is that we are not the only users of the Discourse integration. We (fibery.io) allow our customers to integrate their Discourse instances so they can seamlessly synchronize the data into our tool.

Webhooks are a nice addition, but they won’t help during the first synchronization (which is the heaviest, e.g. topics and posts for the last month). After that we do scheduled syncs for new data, and those are not a problem.

Checking for 429 and retrying works OK; it can just be slow sometimes.
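That check-429-and-slow-down approach can be sketched as follows. The wait time is taken from the Retry-After header when present, otherwise from the wait_seconds field in the JSON error body (the same field shown in the error at the top of this thread):

```python
import json
import time
from urllib.error import HTTPError
from urllib.request import Request, urlopen

def backoff_seconds(retry_after, body, fallback=10.0):
    """Pick a wait time from Retry-After or the JSON error body.

    Discourse sometimes reports wait_seconds of 0 (as in the error at the
    top of this thread), so clamp to at least one second to avoid
    hammering the server.
    """
    if retry_after is not None:
        return max(float(retry_after), 1.0)
    try:
        return max(float(json.loads(body)["extras"]["wait_seconds"]), 1.0)
    except (ValueError, KeyError, TypeError):
        return fallback

def get_with_retry(url, headers=None, max_tries=5):
    """GET a URL, sleeping and retrying whenever the server answers 429."""
    for attempt in range(max_tries):
        try:
            with urlopen(Request(url, headers=headers or {})) as resp:
                return json.load(resp)
        except HTTPError as e:
            if e.code != 429 or attempt == max_tries - 1:
                raise  # a non-rate-limit error, or out of retries
            time.sleep(backoff_seconds(e.headers.get("Retry-After"), e.read()))
```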

Thank you for looking into it :bow:
