> WebLabyrinth is another offensive countermeasure that no one has mentioned yet. It's still fairly new, but worth keeping an eye on.
>
> It creates a maze of bogus pages that traps scripts trying to crawl your website, and it logs anything that gets caught so you can tie it into your monitoring and alerting solution. You can set it up to trigger a temporary firewall rule banning any IP that falls into the trap, or just let them churn down the rabbit hole.
>
> You already have a good list of rewrite rules, so just point those at your labyrinth. Another way to entice bad traffic into the labyrinth is to add bogus entries to your robots.txt that look like they point somewhere interesting, then rewrite those paths to your labyrinth.
>
> http://www.mayhemiclabs.com/content/new-tool-weblabyrinth
> http://code.google.com/p/weblabyrinth/

Thanks David, this looks very interesting.

-Jason
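
For anyone who wants to try the robots.txt bait, a rough sketch of the idea, assuming Apache with mod_rewrite (the /backup-admin path and the labyrinth script location are made-up examples for illustration, not anything WebLabyrinth ships with):

```apache
# --- robots.txt ---
# Advertise a tempting path that nothing legitimate ever links to.
# Well-behaved crawlers will honor the Disallow and skip it; scrapers
# that ignore robots.txt walk straight into the trap.
User-agent: *
Disallow: /backup-admin/

# --- httpd.conf or .htaccess ---
# Silently hand any request for the bait path to the labyrinth,
# which then serves an endless chain of bogus pages and logs the hit.
RewriteEngine On
RewriteRule ^backup-admin(/.*)?$ /labyrinth/index.php [L]
```

Anything showing up in the logs for that path is by definition ignoring robots.txt, which makes it a clean signal to feed into alerting or a temporary firewall ban.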