Hi there, I wanted to report some aggressive crawling by the bot with the user agent
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 Safari/600.2.5 (Amazonbot/0.1; +https://developer.amazon.com/support/amazonbot)
It appears to be a bot run by Amazon, but I couldn’t check the originating IP addresses to confirm that.
This is what the last 5 days look like:
For comparison, this is our user agents table for the last two days: 39,649 vs 457.
I personally don’t care too much about this, as we’re not the ones doing the hosting (CDCK is) and we haven’t noticed performance issues. So I figured this could be interesting to share here.
From our site and container logs, it appears that there was a spike only on that particular day and only on that site.
May 1st:
Client IP        Requests (Amazonbot*)
107.23.182.118   3,560
54.90.49.0       3,210
35.175.129.27    3,204
3.80.18.217      2,646
35.153.79.214    2,529
34.201.164.175   2,432
107.21.55.67     1,959
34.204.61.165    1,538
18.208.120.81    1,473
100.25.191.160   1,276

* Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 Safari/600.2.5 (Amazonbot/0.1; +https://developer.amazon.com/support/amazonbot)
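For anyone wanting to reproduce per-IP counts like the ones above from a plain access log, something along these lines works. The sample log lines and the combined-log-style format here are assumptions; adjust the awk field number to match your own log format:

```shell
# Hypothetical access-log excerpt; in practice you would read from your
# real log file, e.g. /var/log/nginx/access.log (path is an assumption).
printf '%s\n' \
  '107.23.182.118 - - [01/May/2022:00:00:01] "GET / HTTP/1.1" 200' \
  '107.23.182.118 - - [01/May/2022:00:00:02] "GET /t/1 HTTP/1.1" 200' \
  '54.90.49.0 - - [01/May/2022:00:00:03] "GET / HTTP/1.1" 200' |
awk '{counts[$1]++} END {for (ip in counts) print counts[ip], ip}' |
sort -rn
```

Filtering the log for the Amazonbot user agent first (e.g. with `grep -i amazonbot`) would limit the counts to that crawler.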
I see. Thanks for checking it. Probably a technical user having a bad day and pointing a trashy bot at our website, to no effect. We’ve since blocked that crawler.
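The post doesn’t say how the crawler was blocked, but Amazon’s documentation says Amazonbot honors standard robots.txt rules, so one minimal way would be a robots.txt entry like this (a sketch, not necessarily what was done here):

```
# Block Amazonbot entirely via robots.txt
User-agent: Amazonbot
Disallow: /
```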
Since I myself was just hit by something like this…
I’m very happy for Alexa to be able to use my site content to answer questions, so I don’t really want to block it. However, I just saw a three-day burst of heavy traffic from AmazonBot (heavy relative to all other site traffic, including all other bots combined), and I see that Amazon says:
AmazonBot does not support the crawl-delay directive in robots.txt
It therefore seems prudent to add Amazonbot to the slow_down_crawler_user_agents site setting, so that it does not have an outsized impact on site performance for users.
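For reference, on a self-hosted Discourse this setting is normally edited in the admin UI, but it can also be changed from the Rails console; a sketch (the setting takes a pipe-separated list of user-agent substrings, matched case-insensitively):

```ruby
# Rails console on a self-hosted Discourse (sketch; hosted sites should
# use Admin → Settings instead).
SiteSetting.slow_down_crawler_user_agents = "amazonbot"
```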
Thanks, Discourse folks, for implementing functionality that crawlers ought to, but in this case do not.