Is there a way to view IP addresses of anonymous users / guests?
And/or the number of connections from each?
My site has been getting hundreds and hundreds of page views per minute for the last few hours, and processor usage is maxed out at 100% too.
Have you checked your Nginx logs (or whatever server you are using)? Those are bots and knockers. You don’t do anything with IPs; those will change after a few tries (IPs are useless anyway, since they always change). You should start geo-blocking (very few sites are truly global) and ban plenty of user agents at the server level; at the very least all the totally unnecessary and greedy SEO scrapers, bots that identify themselves as IE 5.x, and so on.
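A minimal nginx sketch of that kind of server-level user-agent ban (the agents listed are only illustrative examples, not a vetted blocklist):

    # goes in the http {} context: flag requests whose User-Agent matches
    # any unwanted pattern (case-insensitive); patterns are examples only
    map $http_user_agent $blocked_agent {
        default         0;
        ~*SemrushBot    1;   # example: greedy SEO scraper
        ~*AhrefsBot     1;   # example: greedy SEO scraper
        "~*MSIE 5"      1;   # nothing legitimate identifies as IE5 anymore
    }

    server {
        if ($blocked_agent) {
            return 403;      # rejected before the request ever reaches the app
        }
        # ... rest of the server block ...
    }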
If you did a standard installation, there should be rate limiting at play (and some is built into Rails). How did you install?
How does that rate limiting actually work (no, I didn’t search for the answer)? Does it kick in after a certain number of requests per IP within some time frame?
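If it works like generic nginx rate limiting, it would be something like this (a sketch with an invented zone name and numbers, not Discourse’s actual template values):

    # http {} context: track clients by IP, allowing roughly 1 request/second
    limit_req_zone $binary_remote_addr zone=per_ip:10m rate=1r/s;

    server {
        location / {
            # tolerate short bursts of 10, then reject the excess immediately
            limit_req zone=per_ip burst=10 nodelay;
        }
    }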
Anyway, in a DDoS-ish situation (for example, when a URL gets mentioned on some stupid list and a flood starts from IPs in China, Pakistan, Iran, Iraq, Vietnam and Russia, plus a lot from big VPS services, mainly in the USA, France and Germany), rate limiting doesn’t help much when each client tries three times and then changes IP.
At some point I got a lot of stupid searches. And by “a lot” I mean enough to crash a 5 USD DigitalOcean droplet, while I had almost zero requests from humans.
This is more or less a matter for the web server, not Discourse; those knockers should be killed before they reach the app. I know my situation and solutions are much easier than the OP’s or those of most webmasters here, because I’m from Finland and my forum is purely Finnish, so banning the rest of the world is possible for me (well, Finns living outside Finland see that differently).
But regardless of rate limiting, at least fake user agents should be stopped right away.
How about SSH knockers? Those eat resources too.
If you’re having DDoS issues then I would recommend Cloudflare (which I almost never do). Make sure that you use the Cloudflare template, and then turn off the optimizations (or read carefully about configuring them).
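For the template part, that means enabling it in your container definition; a sketch of the relevant section (check the exact file names in /var/discourse/templates before relying on them):

    ## containers/app.yml
    templates:
      - "templates/postgres.template.yml"
      - "templates/redis.template.yml"
      - "templates/web.template.yml"
      - "templates/web.ratelimited.template.yml"   # the standard rate limiting
      - "templates/cloudflare.template.yml"        # restore real client IPs behind Cloudflare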
It is not a real DDoS, but when there are a lot of requests of the same type they act like one; the result is the same. I’m using Varnish and Fail2ban plus geo-blocking, and these do the job for me.
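To illustrate the Fail2ban part, a sketch of a flood jail (the filter name, regex, and thresholds are assumptions to adapt, not a tested config; it also assumes a conventional access log with the client IP first):

    # /etc/fail2ban/filter.d/nginx-flood.conf  (hypothetical filter name)
    [Definition]
    failregex = ^<HOST> .*"(GET|POST).*HTTP.*"

    # /etc/fail2ban/jail.local
    [nginx-flood]
    enabled  = true
    port     = http,https
    filter   = nginx-flood
    logpath  = /var/log/nginx/access.log
    maxretry = 60      # more than 60 hits...
    findtime = 60      # ...within 60 seconds...
    bantime  = 3600    # ...earns a one-hour ban (values are illustrative)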
Yup, I did a standard install.
My first thought was simply being slashdotted, but 24hrs later I’m still getting hammered.
The quick temporary fix was to shut the front door and set the site to “login required”. That worked: CPU usage dropped from 100% down to 3% within 60 seconds.
As soon as I unlock the front door my /search page is instantly being hammered with rubbish like:
/search?q=dogs+order:latest&page=2
/search?q=cats+order:latest&page=2
/search?q=fly+order:latest&page=2
etc
This happens within 60 seconds of opening the site again.
We’re a small niche group, so I’m not sure why anyone would target us with anything.
It’s only an estimate, but according to Google Analytics we normally have 8 to 10 active users, and this spikes to 1,000+ within seconds of me opening the site back up to the public. All connections show as coming from various parts of the USA, all direct with no referrers.
I’ll leave the site closed to members only for a few days and see if it goes away, or see if I can limit /search to logged-in users only; if not, I’ll probably have to go the Cloudflare route.
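One rough way I could do that at the nginx layer would be to turn away /search requests that carry no Discourse session cookie. This sketch assumes the login cookie is named “_t”, which is an assumption I’d need to verify against my own logged-in traffic:

    # reject anonymous hits on /search before they reach the app
    location /search {
        if ($cookie__t = "") {
            return 403;               # no session cookie, no search
        }
        proxy_pass http://discourse;  # placeholder for the existing upstream
    }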
Thanks everyone
Yeah. That’s bizarre, but I guess that’s the internet.
Ohh. What is the user agent? Maybe you could add it to blocked crawler user agents?
Good question!
I’ll see if I can find that on the Google Analytics report page
I think you can also look in /var/discourse/shared/log/var-log/nginx/.... (or something very much like that). There are some other settings, like slow down crawler user agents, that you’ll find if you search the settings for “agent”.
Found it in: /var/discourse/shared/standalone/log/var-log/nginx/access.log
(downloading locally now)
Thanks @pfaffman I’ll see if anything jumps out at me.
A quick scan seems to show a varied mixture of user agents.
This is also making me suspect an attack now, rather than a slashdotting.
[12/Jan/2022:13:26:20 +0000] "greyarro.ws" 184.174.102.229 "GET /search?q=cats+order%3Alatest&page=2 HTTP/2.0" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.57 Safari/537.36" "search/show" 302 1117 "-" 0.012 0.009 "-" "-" "-" "-" "-" "-" "-"
[12/Jan/2022:13:27:22 +0000] "greyarro.ws" 173.211.78.162 "GET /search?q=cats+order%3Alatest&page=2 HTTP/2.0" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/90.0.4430.93 Safari/537.36" "search/show" 302 1117 "-" 0.012 0.009 "-" "-" "-" "-" "-" "-" "-"
[12/Jan/2022:13:30:46 +0000] "greyarro.ws" 66.78.24.176 "GET /search?q=cats+order%3Alatest&page=2 HTTP/2.0" "Mozilla/5.0 (X11; Linux x86_64; rv:89.0.2) Gecko/20100101 Firefox/89.0.2" "search/show" 302 1117 "-" 0.020 0.019 "-" "-" "-" "-" "-" "-" "-"
[12/Jan/2022:16:10:32 +0000] "greyarro.ws" 38.18.59.158 "GET /search?q=cats+order%3Alatest&page=2 HTTP/2.0" "Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.86 Safari/537.36" "search/show" 302 1117 "-" 0.008 0.011 "-" "-" "-" "-" "-" "-" "-"
[12/Jan/2022:16:10:57 +0000] "greyarro.ws" 108.62.69.249 "GET /search?q=cats+order%3Alatest&page=2 HTTP/2.0" "Mozilla/5.0 (Windows NT 6.3; Win64; x64; rv:88.0.1) Gecko/20100101 Firefox/88.0.1" "search/show" 302 1117 "-" 0.008 0.009 "-" "-" "-" "-" "-" "-" "-"
and:
[12/Jan/2022:16:11:07 +0000] "greyarro.ws" 38.18.49.252 "GET /search?q=dogs+order%3Alatest&page=2 HTTP/2.0" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.78 Safari/537.36" "search/show" 302 1117 "-" 0.012 0.011 "-" "-" "-" "-" "-" "-" "-"
[12/Jan/2022:16:28:08 +0000] "greyarro.ws" 206.180.185.39 "GET /search?q=dogs+order%3Alatest&page=2 HTTP/2.0" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:88.0) Gecko/20100101 Firefox/88.0" "search/show" 302 1117 "-" 0.016 0.012 "-" "-" "-" "-" "-" "-" "-"
[12/Jan/2022:16:28:08 +0000] "greyarro.ws" 38.18.55.132 "GET /search?q=dogs+order%3Alatest&page=2 HTTP/2.0" "Mozilla/5.0 (Windows NT 6.2; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.57 Safari/537.36" "search/show" 302 1117 "-" 0.008 0.009 "-" "-" "-" "-" "-" "-" "-"
[12/Jan/2022:16:28:10 +0000] "greyarro.ws" 184.174.54.113 "GET /search?q=dogs+order%3Alatest&page=2 HTTP/2.0" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.69 Safari/537.36" "search/show" 302 1117 "-" 0.012 0.011 "-" "-" "-" "-" "-" "-" "-"
[12/Jan/2022:16:28:14 +0000] "greyarro.ws" 184.174.72.90 "GET /search?q=dogs+order%3Alatest&page=2 HTTP/2.0" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:89.0.1) Gecko/20100101 Firefox/89.0.1" "search/show" 302 1117 "-" 0.016 0.017 "-" "-" "-" "-" "-" "-" "-"
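To get a quick tally locally: in this log format the user agent lands in the sixth quote-delimited field and the client IP in the third, so I can count them with something like this (the field numbers would need adjusting for a different format):

    # count requests per user agent, most frequent first
    awk -F'"' '{print $6}' access.log | sort | uniq -c | sort -rn | head

    # count requests per client IP (stripping the surrounding spaces)
    awk -F'"' '{gsub(/ /, "", $3); print $3}' access.log | sort | uniq -c | sort -rn | head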
I fear we’re drifting off the original topic here a bit, but I appreciate the help, thanks.
All those settings help only if a bot follows the limits and guidance. Even Googlebot doesn’t all the time, and the bad ones never do. That is one reason why robots.txt is so useless.
And a disclaimer: I don’t know if Discourse uses some other technique to slow bots down.
No, we aren’t, because your question was a bit off. You wanted to know one small detail, when you should have asked “what do I do when my Discourse is under a bot attack / DDoS?”.