-----Original Message-----
From: centos-bounces@centos.org [mailto:centos-bounces@centos.org] On Behalf Of Keith Keller
Sent: Sunday, August 28, 2016 4:23 PM
To: centos@centos.org
Subject: Re: [CentOS] .htaccess file
On 2016-08-28, TE Dukes <tdukes@palmettoshopper.com> wrote:
I'm just not following or understanding. The .htaccess file works but on a slow DSL, I don't want the hits.
What exactly is slow when you receive requests from remote clients that you don't want? Are you actually seeing problems when clients make requests and Apache has to read in your 2MB .htaccess on every request? And if so, you might also consider moving your blocking even higher, to iptables rules, so that Apache never even has to deal with them.
I added the following to my httpd.conf:
<Directory "/var/www/htdocs">
    AddType text/htdocs ".txt"
</Directory>
And copied my .htaccess to /var/www/htdocs as htaccess.txt
Where did you get the idea that this is how to do global Apache configuration? This won't actually do anything useful.
In the example from the Apache website, I don't get the: AddType text/example ".exm" Where did they come up with .exm?
They made it up as an example, to demonstrate how directives work in .htaccess files versus global Apache config files. It's not meant to demonstrate how to add blocking rules to the global config.
Here's the main point of that page:
"Any directive that you can include in a .htaccess file is better set in a Directory block, as it will have the same effect with better performance."
So, to achieve what I think you're hoping, take all the IPs you're denying in your .htaccess file, put them into a relevant Directory block in a config file under /etc/httpd, reload Apache, and move your .htaccess file out of the way. Then httpd will no longer have to read in .htaccess for every HTTP request.
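As a rough sketch (the directory path is the one from your earlier message, the IP addresses are placeholders, and the syntax assumes Apache 2.4 as shipped with CentOS 7 -- on 2.2 you'd use Order/Deny instead), a drop-in such as /etc/httpd/conf.d/blocklist.conf might look like:

    <Directory "/var/www/htdocs">
        # Allow everyone except the listed addresses/ranges
        <RequireAll>
            Require all granted
            Require not ip 192.0.2.10
            Require not ip 198.51.100.0/24
        </RequireAll>
    </Directory>

After a "systemctl reload httpd" (or "apachectl graceful"), Apache serves those rules from its in-memory config instead of re-reading .htaccess on every request.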
Or, alternatively, block those IPs using iptables instead. However, clients will still be able to make those requests, and that will still use bandwidth on your DSL. The only way to eliminate that altogether is to block those requests on the other side of your link. That's something you'd have to work out with your ISP, but I don't think it's common for ISPs to put up blocking rules solely for this purpose, or to allow home users to configure such blocks themselves.
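For what it's worth, host-side blocking with plain iptables could be as simple as the following (the addresses are again placeholders, and on CentOS 7 you may be going through firewalld rather than raw iptables):

    # Drop web traffic from unwanted sources before it reaches Apache
    iptables -A INPUT -s 192.0.2.10 -p tcp --dport 80 -j DROP
    iptables -A INPUT -s 198.51.100.0/24 -p tcp --dport 80 -j DROP

Packets dropped there never reach httpd at all, though as noted above they still cross your DSL link.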
--keith
[Thomas E Dukes] I set up an ipset but quickly ran out of room in the set. I guess I'll have to set up multiple sets. Right now I'm just trying to take some load off my home server from bad bots, but I'm getting hit on other services as well.
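(One possible way around the size limit, as a sketch -- the set name and addresses here are made up: hash:ip sets default to maxelem 65536, so you can either create the set with a larger maxelem or use hash:net so one entry covers a whole range.)

    # Larger set that accepts CIDR ranges as single entries
    ipset create badbots hash:net family inet maxelem 262144
    ipset add badbots 198.51.100.0/24
    iptables -A INPUT -m set --match-set badbots src -j DROP

Using hash:net means a /24 of abusive hosts costs one slot instead of 254, which stretches the set a lot further.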
There's nothing on the webserver except a test site I use. Just trying to keep out the ones that ignore robots.txt
Thanks!!