Stopping abuse
by Volker Weber
I need to block abusive spiders. Not those run by search engines, but users who download the whole site with some Mozilla plugin. I have no idea who wants thousands of old pages, but there have now been two incidents where people downloaded the entire site not just once but twice.
I am thinking about adding a throttle, along the lines of "more than 30 requests per minute and your IP is blocked". Has anybody seen anything like this?
Update: Nils has provided a good hint (see comments). You can now request 15 pages in 60 seconds, which should be enough for everyone. If you exceed that rate, your IP is blocked for 60 seconds. That should stop robots but not humans.
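The throttle described above can be sketched as a small sliding-window rate limiter. This is a hypothetical illustration, not the actual code running on this site: the class name, the in-memory data structures, and the `allow()` interface are all assumptions; a real deployment would hook this into the web server and likely persist state differently.

```python
import time
from collections import defaultdict, deque

# Assumed limits, matching the rule described in the post:
# at most 15 pages per 60 seconds, offenders blocked for 60 seconds.
MAX_REQUESTS = 15
WINDOW = 60.0
BLOCK_SECONDS = 60.0

class Throttle:
    def __init__(self):
        self.hits = defaultdict(deque)   # ip -> timestamps of recent requests
        self.blocked_until = {}          # ip -> time when the block expires

    def allow(self, ip, now=None):
        """Return True if this request may proceed, False if the IP is blocked."""
        now = time.time() if now is None else now
        # Still inside a block period?
        if self.blocked_until.get(ip, 0) > now:
            return False
        q = self.hits[ip]
        # Drop timestamps that have fallen out of the sliding window.
        while q and now - q[0] > WINDOW:
            q.popleft()
        q.append(now)
        if len(q) > MAX_REQUESTS:
            # Rate exceeded: block this IP and forget its history.
            self.blocked_until[ip] = now + BLOCK_SECONDS
            q.clear()
            return False
        return True
```

A bulk downloader firing many requests per second trips the block almost immediately, while a human clicking through pages stays well under the limit.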
Comments
something like this?
PHP Classes - Class: Flood Recorder
http://www.phpclasses.org/browse/package/2268.html
it might have been Google's recently 'recalled' web accelerator plug-in.
do you use "Flood Recorder" or did you find anything else?