How to deal with a sudden spike in “Other traffic” in website analytics

I had issues in July with a flood of requests from Singapore. I blocked an IP range, which worked for a while, but the problem came back harder in August (this time from Singapore, Hong Kong, and Mexico), with high and unexpected CDN costs :face_with_steam_from_nose:

I noticed high pageview counts from Amazonbot, DataForSeoBot, meta-externalagent, SeekportBot, and others.
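For context, one first line of defense against the bots above would be robots.txt. This is only a sketch: the bot names are taken from my analytics, and well-behaved crawlers honor these rules, but aggressive scrapers may ignore them entirely, which is why I'm asking about the crawler settings below.

```
# Hypothetical robots.txt fragment; bot names taken from the list above.
# Only effective against crawlers that actually respect robots.txt.
User-agent: Amazonbot
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: meta-externalagent
Disallow: /

User-agent: SeekportBot
Disallow: /
```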

The documentation topic Controlling Web Crawlers For a Site says:

That list doesn’t include some of the bots that visit my site most often, but I have a question nonetheless.
Would it be advisable to add this whole list to the Blocked crawler user agents setting?
Is there a way to bulk add bot names from a .txt file?
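To illustrate what I mean by bulk adding: a minimal sketch that turns a one-bot-per-line text file into a single delimited string that could be pasted into the setting. Note the file name `bots.txt` and the pipe separator are my assumptions about how the list setting is stored, not a confirmed format.

```python
# Minimal sketch: convert a one-bot-per-line text file into a single
# pipe-separated string for pasting into a list-type site setting.
# The "bots.txt" filename and "|" separator are assumptions, not a
# documented import format.
from pathlib import Path

def bots_to_setting(path: str) -> str:
    # Read lines, strip whitespace, skip blanks, drop duplicates
    # while preserving the original order.
    seen: list[str] = []
    for line in Path(path).read_text().splitlines():
        name = line.strip()
        if name and name not in seen:
            seen.append(name)
    return "|".join(seen)

if __name__ == "__main__":
    print(bots_to_setting("bots.txt"))
```

If there is a supported way to import such a list directly (via the API or a console command), that would of course be preferable to pasting.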
