I just got SLAMMED with accesses to httpd from 91.230.121.156
I added the address to my firewall to drop it. FYI
host 91.230.121.156
156.121.230.91.in-addr.arpa domain name pointer no-rdns.offshorededicated.net.
Jerry
On 2014-10-02 10:23 am, Jerry Geis wrote:
I just got SLAMMED with accesses to httpd from 91.230.121.156
I added the address to my firewall to drop it. FYI
host 91.230.121.156
156.121.230.91.in-addr.arpa domain name pointer no-rdns.offshorededicated.net.
Are you running WordPress?
My company's WordPress installation was getting hammered by an IP in the same netblock yesterday... look in your httpd logs for repeated POST operations to xmlrpc.php.
On Thu, Oct 2, 2014, at 09:29, Mike Burger wrote:
On 2014-10-02 10:23 am, Jerry Geis wrote:
I just got SLAMMED with accesses to httpd from 91.230.121.156
I added the address to my firewall to drop it. FYI
host 91.230.121.156
156.121.230.91.in-addr.arpa domain name pointer no-rdns.offshorededicated.net.
Are you running WordPress?
My company's WordPress installation was getting hammered by an IP in the same netblock yesterday... look in your httpd logs for repeated POST operations to xmlrpc.php.
Most people don't even need xmlrpc.php to be open to the world, so I prefer to block all requests to it. I have also successfully used ngrep to capture POSTs on a server hosting many WordPress sites and log them to a file that is watched by fail2ban: after X many POSTs, automatically ban the IP, for example.
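A minimal sketch of the "block xmlrpc.php for everyone" approach, using the Apache 2.2-style access directives that shipped with CentOS at the time (on Apache 2.4 the equivalent would be "Require all denied"):

```apache
# httpd.conf or vhost context: deny every request for xmlrpc.php
<Files xmlrpc.php>
    Order allow,deny
    Deny from all
</Files>
```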
The reason I did not just monitor the Apache log files for POSTs is that there were so many sites, each with its own log file. I had to aggregate all the POSTs into a single log file so that when the botnet hit multiple WordPress sites it could be identified more easily. Occasionally they'll only do a couple of POSTs from each IP/bot in the group, so the attack would evade detection unless you aggregated everything into one log file.
I use Fail2Ban, which is available from the EPEL repo, to ban these addresses. It works well against SSH attacks by script kiddies as well. I usually block an address for 8 hours.
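As a sketch of the Fail2Ban side, a jail like the following would watch the aggregated POST log described above. The jail name, filter name and log path here are hypothetical; the 28800-second bantime corresponds to the 8 hours mentioned:

```ini
# /etc/fail2ban/jail.local -- jail, filter and logpath names are made up
[apache-posts]
enabled  = true
filter   = apache-posts
action   = iptables-multiport[name=apache-posts, port="http,https"]
logpath  = /var/log/httpd/all-posts.log
maxretry = 20
findtime = 600
bantime  = 28800
```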
On 10/02/2014 10:29 AM, Mike Burger wrote:
On 2014-10-02 10:23 am, Jerry Geis wrote:
I just got SLAMMED with accesses to httpd from 91.230.121.156
I added the address to my firewall to drop it. FYI
host 91.230.121.156
156.121.230.91.in-addr.arpa domain name pointer no-rdns.offshorededicated.net.
Are you running WordPress?
My company's WordPress installation was getting hammered by an IP in the same netblock yesterday... look in your httpd logs for repeated POST operations to xmlrpc.php.
--
David P. Both, RHCE Millennium Technology Consulting LLC Raleigh, NC, USA 919-389-8678
dboth@millennium-technology.com
www.millennium-technology.com www.databook.bz - Home of the DataBook for Linux DataBook is a Registered Trademark of David Both
This communication may be unlawfully collected and stored by the National Security Agency (NSA) in secret. The parties to this email do not consent to the retrieving or storing of this communication and any related metadata, as well as printing, copying, re-transmitting, disseminating, or otherwise using it. If you believe you have received this communication in error, please delete it immediately.
Are you running WordPress?
My company's WordPress installation was getting hammered by an IP in the same netblock yesterday... look in your httpd logs for repeated POST operations to xmlrpc.php.
Yes, that is it.
Jerry
On Thu, Oct 2, 2014 at 10:23 AM, Jerry Geis geisj@pagestation.com wrote:
I just got SLAMMED with accesses to httpd from 91.230.121.156
I added the address to my firewall to drop it. FYI
host 91.230.121.156
156.121.230.91.in-addr.arpa domain name pointer no-rdns.offshorededicated.net.
Jerry
On 10/02/2014 09:48 AM, m.roth@5-cent.us wrote:
Install fail2ban
I followed this tutorial last year; perhaps you can glean some useful info there:
https://www.digitalocean.com/community/tutorials/how-to-protect-ssh-with-fai...
Note: I have absolutely zero real-life experience with web security other than a single site I maintain...
Chris Pemberton wrote:
On 10/02/2014 09:48 AM, m.roth@5-cent.us wrote:
Install fail2ban
I followed this tutorial last year; perhaps you can glean some useful info there:
https://www.digitalocean.com/community/tutorials/how-to-protect-ssh-with-fai...
Note: I have absolutely zero real-life experience with web security other than a single site I maintain...
Have a look in /etc/fail2ban/filter.d at the Apache filters, and see if you can simply uncomment the relevant lines, or use them as models for what you need.
mark
Disabling XMLRPC completely via wp-config.php is quite easy. I can send the required info when I'm in front of a computer. You can also use an .htaccess rule for Apache to stop requests completely. I'm sure there are also rules for Nginx, lighttpd, etc. that can be found quite easily via Google. I'm surprised most people don't have this disabled/blocked already.
— Sent from Mailbox
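The .htaccess rule mentioned above might look like this; it is a sketch assuming mod_rewrite is loaded and that AllowOverride permits rewrite directives:

```apache
# .htaccess in the WordPress document root
RewriteEngine On
# Refuse xmlrpc.php with 403 Forbidden before WordPress even loads
RewriteRule ^xmlrpc\.php$ - [F,L]
```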
On Thu, Oct 2, 2014 at 11:01 AM, Chris Pemberton cjpembo@gmail.com wrote:
On 10/02/2014 09:48 AM, m.roth@5-cent.us wrote:
Install fail2ban
I followed this tutorial last year; perhaps you can glean some useful info there: https://www.digitalocean.com/community/tutorials/how-to-protect-ssh-with-fai...
Note: I have absolutely zero real-life experience with web security other than a single site I maintain...
On Thu, 2 Oct 2014, jwyeth.arch@gmail.com wrote:
Disabling XMLRPC completely via wp-config.php is quite easy. I can send the required info when I'm in front of a computer. You can also use an .htaccess rule for Apache to stop requests completely. I'm sure there are also rules for Nginx, lighttpd, etc. that can be found quite easily via Google. I'm surprised most people don't have this disabled/blocked already.
Another good trick to keep IP-based scanners off your back is to make sure that all HTTP requests carry a valid Host: header. In Apache this is easy: the first-listed <VirtualHost> declaration is the default when a client fails to provide a Host: header in the request, so the initial virtual host is basically a deny-all container, e.g.,
<VirtualHost *:80>
    ServerSignature off
    <Location />
        <RequireAny>
            Require local
            Require ip [some administrative IP addr]
        </RequireAny>
    </Location>
</VirtualHost>

<VirtualHost *:80>
    ServerName www.you.com
    # the real work happens here
    ...
</VirtualHost>
For extra credit, you can write a fail2ban filter that scans the default ErrorLog for telltale signs of IP-based scanning (watch out for unintended line-wrapping in the example below).
# /etc/fail2ban/filter.d/apache-iponly.conf
[DEFAULT]
_apache_error_msg = \[[^\]]*\] \[\S*:error\] \[pid \d+\] \[client <HOST>(:\d{1,5})?\]

[Definition]
failregex = ^%(_apache_error_msg)s (AH0\d+: )?client denied by server configuration: (uri )?.*$
            ^%(_apache_error_msg)s script '\S+' not found or unable to stat(, referer: \S+)?\s*$
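To sanity-check a regex like this without restarting fail2ban, the usual tool is fail2ban-regex(1). As a quick standalone illustration, here is a condensed version of the pattern (with the <HOST> tag replaced by a named group for illustration, and a fabricated log line) exercised in Python:

```python
import re

# Condensed form of the filter's regex; fail2ban's <HOST> tag is
# replaced here by a simple named group. The log line is made up.
apache_error = (r"\[[^\]]*\] \[\S*:error\] \[pid \d+\] "
                r"\[client (?P<host>\S+?)(:\d{1,5})?\] "
                r"(AH0\d+: )?client denied by server configuration: .*")

line = ("[Thu Oct 02 10:23:45.123456 2014] [authz_core:error] [pid 1234] "
        "[client 91.230.121.156:54321] AH01630: "
        "client denied by server configuration: /var/www/html/")

m = re.match(apache_error, line)
print(m.group("host") if m else "no match")   # -> 91.230.121.156
```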
On Thu, 2014-10-02 at 09:44 -0700, Paul Heinlein wrote:
On Thu, 2 Oct 2014, jwyeth.arch@gmail.com wrote:
Another good trick to keep IP-based scanners off your back is to make sure that all HTTP requests carry a valid Host: header. In Apache this is easy: the first-listed <VirtualHost> declaration is the default when a client fails to provide a Host: header in the request, so the initial virtual host is basically a deny-all container, e.g.,
<VirtualHost *:80>
    ServerSignature off
    <Location />
        <RequireAny>
            Require local
            Require ip [some administrative IP addr]
        </RequireAny>
    </Location>
</VirtualHost>

<VirtualHost *:80>
    ServerName www.you.com
    # the real work happens here
    ...
</VirtualHost>
All my web sites are configured as virtual hosts. The 'empty' default web site (one on every server) redirects all requests to 127.0.0.1. Sometimes I change this to a Chinese consumer site. Why give the hackers and pests an opportunity to annoy you? Send them away before their requests ever reach your web site.
xx.xx.xx.xx is the web server's IP address. Some of the configuration relates to the previous system of banning every IP directly accessing the server's IP address.
<VirtualHost xx.xx.xx.xx:80>
    DocumentRoot /data/web/do/default/www
    ServerName xx.xx.xx.xx
    CustomLog /data/web/weblogs/acc.000118 combined
    ErrorLog /data/web/weblogs/err.000118.w
    DirectoryIndex banned.php
    HostnameLookups Off
    <Directory /data/web/do/default/www/>
        RedirectMatch permanent ^/(.*)$ http://127.0.0.1/
    </Directory>
</VirtualHost>
The real web sites have entries beginning with, for example, ...
<VirtualHost example.com:80 www.example.com:80>
On Fri, 3 Oct 2014, Always Learning wrote:
All my web sites are configured as virtual hosts. The 'empty' default web site (one on every server) redirects all requests to 127.0.0.1. Sometimes I change this to a Chinese consumer site. Why give the hackers and pests an opportunity to annoy you? Send them away before their requests ever reach your web site.
I've always assumed (without any data whatsoever on which to base that assumption) that scanbots won't follow redirects to different addresses. Do you have any information to the contrary?
On Fri, 2014-10-03 at 10:03 -0700, Paul Heinlein wrote:
On Fri, 3 Oct 2014, Always Learning wrote:
All my web sites are configured as virtual hosts. The 'empty' default web site (one on every server) redirects all requests to 127.0.0.1. Sometimes I change this to a Chinese consumer site. Why give the hackers and pests an opportunity to annoy you? Send them away before their requests ever reach your web site.
I've always assumed (without any data whatsoever on which to base that assumption) that scanbots won't follow redirects to different addresses. Do you have any information to the contrary?
Alas, it is not only the inevitable crawlers from recognised major search engines, computer start-up companies and curious people, but also determined hackers attempting to probe and break in.
If a crawler wants a web site, then the crawler should follow the domain name and not an IP address. Conversely, hackers choose IP addresses primarily and domain names secondarily.
I'm merely redirecting the IP-curious to 127.0.0.1.
Hope that helps.
Regards,
Paul. England, EU.
Learning until I die or experience dementia.
On Fri, 2014-10-03 at 10:03 -0700, Paul Heinlein wrote:
I've always assumed (without any data whatsoever on which to base that assumption) that scanbots won't follow redirects to different addresses. Do you have any information to the contrary?
If you mean by 'scanbot' a web crawler, then YES.
I recently moved a sub-domain web site on 1111.2222.com to 1111.3333.com and redirected traffic, including crawlers, with a simple:-
<VirtualHost 1111.2222.com:80>
    .............
    <Directory /data/web/do/2222/1111/>
        RedirectMatch permanent ^/(.*)$ http://1111.3333.com/$1
    </Directory>
</VirtualHost>
and Google followed. So did other crawlers.
I have previously, and successfully, moved whole web sites to different domains and got the crawlers to follow by using the same method.
Hope this helps.
Regards,
Paul. England, EU.
On Thu, Oct 2, 2014 at 11:52 AM, jwyeth.arch@gmail.com wrote:
Disabling XMLRPC completely via wp-config.php is quite easy. I can send the required info when I'm in front of a computer. You can also use an .htaccess rule for Apache to stop requests completely. I'm sure there are also rules for Nginx, lighttpd, etc. that can be found quite easily via Google. I'm surprised most people don't have this disabled/blocked already.
+1
I wrote an Apache rewrite rule (saved in a separate file) that I can Include on any WordPress site getting hammered by requests to xmlrpc.php. wp-login.php also gets brute-forced from time to time.
I was kicking back an HTTP 410 (Gone, as opposed to 403 or 404). Of course, they're stupid bots, so they keep hammering away!
With some ISPs using NAT, I prefer the rewrite-rule solution... that way it stops the requests without blocking the IP entirely. Pros and cons, of course, but I prefer a conservative approach first rather than a heavy-handed one.
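A sketch of such an include-able rule file, assuming mod_rewrite is loaded; the filename is hypothetical, and the [G] flag is what returns 410 Gone:

```apache
# /etc/httpd/conf.d/wp-410.inc (hypothetical) -- Include from any affected vhost
RewriteEngine On
# Answer 410 Gone to the two endpoints the bots hammer
RewriteRule ^/?(xmlrpc|wp-login)\.php$ - [G,L]
```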