Over the last few days, Googlebot has been trying to read one URL of our main site over and over again, effectively DDoSing us :) Our website became very slow because of the massive number of requests from the Google crawler.
Here is an excerpt for the curious (or in case a Google engineer reads this post):
66.249.76.54 - - [27/May/2019:06:31:23 +0200] "GET /235432/~plot~+4x%5E2%3B+4%2Ax%5E2+++4%2A%281/32%29%2Ax+-+15%2A%281/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/323274/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/tag/594749/323274/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/323274/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/tag/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/594749/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/323274/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/user/impressum HTTP/1.0" 200 32603
66.249.76.54 - - [27/May/2019:06:31:23 +0200] "GET /235432/~plot~+4x%5E2%3B+4%2Ax%5E2+++4%2A%281/32%29%2Ax+-+15%2A%281/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/323274/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/323274/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/403551/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/403551/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/tag/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/594749/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/323274/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/bestusers HTTP/1.0" 200 32603
66.249.76.55 - - [27/May/2019:06:31:23 +0200] "GET /235432/tag/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/154807/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/594749/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/594749/403551/tag/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/403551/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/schreibregeln HTTP/1.0" 200 32603
66.249.76.54 - - [27/May/2019:06:31:23 +0200] "GET /235432/~plot~+4x%5E2%3B+4%2Ax%5E2+++4%2A%281/32%29%2Ax+-+15%2A%281/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/154807/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/323274/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/403551/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/tag/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/403551/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/594749/tag/chat HTTP/1.0" 200 32603
Or here (note the different IPs, so there are several bots):
66.249.76.54 - - [27/May/2019:09:24:42 +0200] "GET /235432/~plot~+4x%5E2%3B+4%2Ax%5E2+++4%2A%281/386961/~plot~+4x%5E2%3B+4%2Ax%5E2+++4%2A%281/32%29%2Ax+-+15%2A%281/403551/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/594749/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/594749/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/user/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/594749/user/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/punkte HTTP/1.0" 200 32587
66.249.76.55 - - [27/May/2019:09:24:42 +0200] "GET /235432/~plot~+4x%5E2%3B+4%2Ax%5E2+++4%2A%281/32%29%2Ax+-+15%2A%281/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/403551/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/user/403551/agb HTTP/1.0" 200 32587
66.249.76.56 - - [27/May/2019:09:24:42 +0200] "GET /235432/~plot~+4x%5E2%3B+4%2Ax%5E2+++4%2A%281/32%29%2Ax+-+15%2A%281/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/323274/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/594749/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/594749/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/user/154807/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/594749/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/403551/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/403551/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/qa-theme/lounge/js/lounge.min.js?v=2019-01-17 HTTP/1.0" 200 32587
66.249.76.55 - - [27/May/2019:09:24:42 +0200] "GET /235432/~plot~+4x%5E2%3B+4%2Ax%5E2+++4%2A%281/32%29%2Ax+-+15%2A%281/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/tag/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/594749/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/luckentext-zum-thema-extrema-funktionenschar-ft-x-1-2-tx-2-2-t HTTP/1.0" 200 32587
66.249.76.58 - - [27/May/2019:09:24:42 +0200] "GET /235432/~plot~+4x%5E2%3B+4%2Ax%5E2+++4%2A%281/32%29%2Ax+-+15%2A%281/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/323274/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/323274/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/tag/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/323274/user/Lu HTTP/1.0" 200 32587
66.249.76.57 - - [27/May/2019:09:24:42 +0200] "GET /235432/~plot~+4x%5E2%3B+4%2Ax%5E2+++4%2A%281/32%29%2Ax+-+15%2A%281/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/154807/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/323274/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/235432/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/32)*x%20-%2015*(1/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/tag/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/user/user/tag/tag/~plot~%204x%5E2;%204*x%5E2%20+%204*(1/badges HTTP/1.0" 200 32587
The faulty link that led to this problem:
~plot~ 4x^2; 4*x^2 + 4*(1/32)*x - 15*(1/32)^2; x; [[0.1]]~plot~
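Until the link itself is fixed, one stopgap (my own assumption, not an official Google recommendation) is to block the malformed URLs via robots.txt. They all contain the ~plot~ fragment, and Googlebot understands the * wildcard in Disallow rules:

User-agent: Googlebot
Disallow: /*~plot~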
Where can I report a Googlebot bug?
There seems to be no official way to report one.
Here is the link for reporting a Googlebot crawling bug:
https://www.google.com/webmasters/tools/googlebot-report
Report a problem with how Googlebot crawls your site.
You can report problems only for domain-level properties (for example, "www.example.com/")
The rate at which Google crawls your page depends on many factors:
The URLs we already know about
Links from other web pages (within your site and on other sites)
URLs listed in your Sitemap.
For most sites, Googlebot shouldn't access your site more than once every few seconds on average.
The report link was probably not easy to find because you need a Google Webmaster account, and apparently you can only report your own websites.
Related
Today I checked the access log of my website and found a very strange thing.
When a user accesses the file "/downloads/files/ddbfr.exe" with the user agent "LWP::Simple/6.13 libwww-perl/6.13", they get a 404 error:
108.162.215.191 - - [22/Dec/2021:12:30:35 -0800] "GET /dbf-repair/ddbfr.exe HTTP/1.1" 301 - "-" "LWP::Simple/6.13 libwww-perl/6.13"
172.70.98.107 - - [22/Dec/2021:12:30:36 -0800] "HEAD /downloads/files/ddbfr.exe HTTP/1.1" 404 - "-" "LWP::Simple/6.13 libwww-perl/6.13"
108.162.215.191 - - [22/Dec/2021:12:30:36 -0800] "GET /dbf-repair/ddbfr.exe HTTP/1.1" 301 - "-" "LWP::Simple/6.13 libwww-perl/6.13"
172.70.98.33 - - [22/Dec/2021:12:30:37 -0800] "HEAD /downloads/files/ddbfr.exe HTTP/1.1" 404 - "-" "LWP::Simple/6.13 libwww-perl/6.13"
However, when accessing the same file "/downloads/files/ddbfr.exe" with another user agent, such as Firefox, everything is OK:
172.70.210.253 - - [22/Dec/2021:13:10:28 -0800] "GET /downloads/files/ddbfr.exe HTTP/1.1" 200 8035760 "-" "Mozilla/5.0 (X11; Linux x86_64; en-US; rv:1.9.2.12) Gecko/20101026 Firefox/3.6.12"
It seems that my server returns 404 for a specific user agent. However, I checked my .htaccess file and cannot find anything like the rule in "301 redirect based on user-agent". Why?
Note: There is a redirect from /dbf-repair/ddbfr.exe to /downloads/files/ddbfr.exe in .htaccess.
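For reference, a user-agent-conditional rule of the kind I was looking for would look roughly like this in .htaccess (a hypothetical sketch, not something found in my actual config):

RewriteEngine On
# hypothetical: answer 404 to libwww-perl clients only
RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC]
RewriteRule ^downloads/files/ddbfr\.exe$ - [R=404,L]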
Forgive me if this is a duplicate question, but I'm running a Flask app using Celery and RabbitMQ in a Kubernetes service. The service runs as a public-facing LoadBalancer. The problem I've seen is with ZmEu scan attacks, which mess up the Flask URI structure and render the app unusable:
10.240.0.4 - - [19/Apr/2018 04:48:05] "GET / HTTP/1.1" 400 -
10.240.0.4 - - [19/Apr/2018 04:48:10] "GET /index.action HTTP/1.1" 400 -
10.244.2.1 - - [19/Apr/2018 07:32:29] "GET / HTTP/1.1" 400 -
10.244.2.1 - - [19/Apr/2018 08:54:38] "GET / HTTP/1.1" 400 -
10.244.2.1 - - [19/Apr/2018 11:13:06] "GET /w00tw00t.at.blackhats.romanian.anti-sec :) HTTP/1.1" 400 -
10.240.0.4 - - [19/Apr/2018 11:13:11] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 400 -
10.240.0.4 - - [19/Apr/2018 11:13:16] "GET /phpmyadmin/scripts/setup.php HTTP/1.1" 400 -
10.244.2.1 - - [19/Apr/2018 11:13:21] "GET /pma/scripts/setup.php
...
This occurs after successful pings to the URI. I have a healthcheck in place because this silently renders the app unreachable, even though the container itself stays healthy and doesn't exit.
While I am working on securing the endpoint itself, that feels more like a band-aid; the app code should be resilient to this type of attack. Is there a way for me to catch any unspecified URI (one not defined by an @app.route() decorator)?
EDIT: I've made an attempt to accept only requests that have a URI header, by using redirects on the general URI path http://my.flask.app:80/ to point to the proper service, but this hasn't remedied the problem.
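For illustration, Flask sends every URI that doesn't match a registered route to its 404 error handler, which can be overridden to absorb scanner probes cleanly. A minimal sketch (the handler name and messages are my own):

from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    return "OK"

# Any URI without a matching @app.route() lands here instead of
# disturbing the app; log the probe and answer with a plain 404.
@app.errorhandler(404)
def unknown_uri(error):
    app.logger.warning("unmatched URI: %s", request.path)
    return "not found", 404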
I am using the Confluent Kafka platform. I have a topic with 4 partitions and a replication factor of 2, with a single ZooKeeper node, three brokers, and a kafka-rest proxy server. Now I am load testing the system with siege, running 1000 users against a list of APIs which in turn hit the Kafka producer. My producer and consumer use the REST proxy (kafka-rest). I am getting the following issue:
{ [Error: getaddrinfo EMFILE] code: 'EMFILE', errno: 'EMFILE', syscall: 'getaddrinfo' }
In kafka-rest log I can see:
[2016-02-23 07:13:51,972] INFO 127.0.0.1 - - [23/Feb/2016:07:13:51 +0000] "POST /topics/endsession HTTP/1.1" 200 120 14 (io.confluent.rest-utils.requests:77)
[2016-02-23 07:13:51,973] INFO 127.0.0.1 - - [23/Feb/2016:07:13:51 +0000] "POST /topics/endsession HTTP/1.1" 200 120 15 (io.confluent.rest-utils.requests:77)
[2016-02-23 07:13:51,974] INFO 127.0.0.1 - - [23/Feb/2016:07:13:51 +0000] "POST /topics/endsession HTTP/1.1" 200 120 12 (io.confluent.rest-utils.requests:77)
[2016-02-23 07:13:51,978] INFO 127.0.0.1 - - [23/Feb/2016:07:13:51 +0000] "POST /topics/endsession HTTP/1.1" 200 120 6 (io.confluent.rest-utils.requests:77)
[2016-02-23 07:13:51,983] INFO 127.0.0.1 - - [23/Feb/2016:07:13:51 +0000] "POST /topics/endsession HTTP/1.1" 200 120 6 (io.confluent.rest-utils.requests:77)
[2016-02-23 07:13:51,984] INFO 127.0.0.1 - - [23/Feb/2016:07:13:51 +0000] "POST /topics/endsession HTTP/1.1" 200 120 4 (io.confluent.rest-utils.requests:77)
[2016-02-23 07:13:51,985] INFO 127.0.0.1 - - [23/Feb/2016:07:13:51 +0000] "POST /topics/endsession HTTP/1.1" 200 120 7 (io.confluent.rest-utils.requests:77)
[2016-02-23 07:13:51,993] INFO 127.0.0.1 - - [23/Feb/2016:07:13:51 +0000] "POST /topics/endsession HTTP/1.1" 200 120 3 (io.confluent.rest-utils.requests:77)
[2016-02-23 07:13:51,994] INFO 127.0.0.1 - - [23/Feb/2016:07:13:51 +0000] "POST /topics/endsession HTTP/1.1" 200 120 4 (io.confluent.rest-utils.requests:77)
[2016-02-23 07:13:51,999] WARN Accept failed for channel java.nio.channels.SocketChannel[closed] (org.eclipse.jetty.io.SelectorManager:714)
java.io.IOException: Too many open files
at sun.nio.ch.ServerSocketChannelImpl.accept0(Native Method)
at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:422)
at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:250)
at org.eclipse.jetty.io.SelectorManager$ManagedSelector.processAccept(SelectorManager.java:706)
at org.eclipse.jetty.io.SelectorManager$ManagedSelector.processKey(SelectorManager.java:648)
at org.eclipse.jetty.io.SelectorManager$ManagedSelector.select(SelectorManager.java:611)
at org.eclipse.jetty.io.SelectorManager$ManagedSelector.run(SelectorManager.java:549)
at org.eclipse.jetty.util.thread.NonBlockingThread.run(NonBlockingThread.java:52)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:745)
So I went through a lot of questions related to this and tuned my EC2 machine's parameters so that I don't hit the too-many-open-files error, but the problem is not solved. I have reduced TIME_WAIT to 30 seconds, and ulimit -n is 80000.
I have collected some stats, and it looks like the kafka-rest proxy running on localhost:8082 is causing too many connections.
How do I solve this issue? Also, when the error occurs I sometimes stop my siege test, wait until the TIME_WAIT connections have drained, and restart the load test with only 1 user, yet I still see the same issue. Is there some issue in the REST proxy wrapper for Node.js?
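Typical one-liners for gathering these stats look like this (an illustrative sketch, not my exact commands):

# count sockets by TCP state (TIME_WAIT, ESTABLISHED, ...)
netstat -ant | awk 'NR>2 {print $6}' | sort | uniq -c | sort -rn
# count open file descriptors held by the kafka-rest process
ls /proc/$(pgrep -f kafka-rest)/fd | wc -l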
You need to increase the ulimit for that process. To check the ulimit for a particular process, run this:
sudo cat /proc/<process_id>/limits
To increase the ulimit for a process running via supervisord, you can increase minfds in supervisord.conf:
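For example (the value here is illustrative):

[supervisord]
; file-descriptor limit supervisord requests for itself and passes on
; to its child processes (the default is a low 1024)
minfds=80000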
I have a Magento 1.9.2 system that is currently undergoing a brute-force attack against its XML-RPC endpoint from an EC2 host.
I can simply firewall the source address, but that is a short-term solution, since the site will likely face another attack from a different address. I would like to detect these attacks automatically so I can lock them down.
Fail2ban is commonly used in such circumstances, but for it to work, I understand it must be able to find login-failure messages in a log file somewhere; however, Magento does not seem to be logging the failed attempts.
How can I prevent the XML-RPC endpoint from being brute-forced?
54.246.87.74 - - [20/Jul/2015:13:10:24 +0000] "POST /index.php/api/xmlrpc/ HTTP/1.1" 200 777 "-" "XML-RPC.NET"
54.246.87.74 - - [20/Jul/2015:13:10:24 +0000] "POST /index.php/api/xmlrpc/ HTTP/1.1" 200 777 "-" "XML-RPC.NET"
54.246.87.74 - - [20/Jul/2015:13:10:25 +0000] "POST /index.php/api/xmlrpc/ HTTP/1.1" 200 777 "-" "XML-RPC.NET"
54.246.87.74 - - [20/Jul/2015:13:10:26 +0000] "POST /index.php/api/xmlrpc/ HTTP/1.1" 200 777 "-" "XML-RPC.NET"
54.246.87.74 - - [20/Jul/2015:13:10:27 +0000] "POST /index.php/api/xmlrpc/ HTTP/1.1" 200 777 "-" "XML-RPC.NET"
Action taken so far
I've configured fail2ban with a new filter and jail to lock it down, but I still don't know if this is the best solution.
filter.d/magento-xmlrpc.conf
[Definition]
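# Magento logs no auth failures, so count every POST to the XML-RPC API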
failregex = ^<HOST> .*POST .*api\/xmlrpc\/
ignoreregex =
jail.local
[magento-xmlrpc]
enabled = true
port = http,https
filter = magento-xmlrpc
logpath = /home/user/logs/access.log
maxretry = 20
findtime = 30
bantime = 600
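With maxretry = 20 and findtime = 30, twenty such POSTs within 30 seconds trigger a 600-second ban. The filter can be checked against the log entries above with fail2ban's own test tool:

fail2ban-regex /home/user/logs/access.log /etc/fail2ban/filter.d/magento-xmlrpc.conf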
I have a setup with MCollective 1.2.0, Puppet 2.6.4, and a provisioning agent. Most of the time this works great, but sometimes (on every 10th node or so) I find that the signing requests of puppet agents are not getting signed on the master.
In this case, every request from the puppet agent to "/production/certificate/..." fails with an HTTP error 404.
The problem is also hard to analyze because the log output is not very detailed.
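For the record, this is roughly how I inspect an affected node (as I understand the Puppet 2.6 CLI; the node name is a placeholder):

# on the master: list certificate requests waiting to be signed
puppet cert --list
# sign one by hand
puppet cert --sign node.example.com
# on the node: re-run the agent with full debug output
puppet agent --test --debug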
Puppet-Agent:
Jun 18 16:10:38 ip-10-242-62-183 puppet-agent[1001]: Creating a new SSL key for ...
Jun 18 16:10:38 ip-10-242-62-183 puppet-agent[1001]: Caching certificate for ca
Jun 18 16:10:41 ip-10-242-62-183 puppet-agent[1001]: Creating a new SSL certificate request for ...
Jun 18 16:10:41 ip-10-242-62-183 puppet-agent[1001]: Certificate Request fingerprint (md5): 6A:3F:63:8A:59:2C:F6:C9:5E:56:5F:39:16:FF:19:BE
Puppet-Master:
"GET /production/certificate/a.b.c.d HTTP/1.1" 404
"GET /production/certificate_request/a.b.c.d HTTP/1.1" 404
"GET /production/certificate/a.b.c.d HTTP/1.1" 404
"GET /production/certificate/a.b.c.d HTTP/1.1" 404
"GET /production/certificate/a.b.c.d HTTP/1.1" 404
"GET /production/certificate/a.b.c.d HTTP/1.1" 404
... last message repeats endlessly
Does anyone have a clue about this?
Markus