Stopping abuse

by Volker Weber

I need to block abusive spiders. Not those run by search engines, but users who download the whole site with some Mozilla plugin. I have no idea who wants thousands of old pages, but there have now been two incidents where people downloaded the site not just once but twice.

I am thinking about adding a throttle, along the lines of "more than 30 requests per minute and your IP is blocked". Has anybody seen anything like this?

Update: Nils has provided a good hint (see comments). You can now request 15 pages in 60 seconds, which should be enough for everyone. If you exceed that rate, your IP is blocked for 60 seconds. That should stop robots but not humans.
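The throttle described above can be sketched as a sliding-window rate limiter. This is a hypothetical Python illustration, not the code running on this site (which, per the Flood Recorder comment below, was presumably PHP); the class and method names are made up for the example:

```python
import time
from collections import defaultdict, deque

WINDOW = 60        # look at requests in the last 60 seconds
MAX_REQUESTS = 15  # allow at most 15 pages per window
BLOCK_TIME = 60    # block offending IPs for 60 seconds

class Throttle:
    """Per-IP sliding-window throttle: exceed the rate, get blocked briefly."""

    def __init__(self):
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests
        self.blocked_until = {}         # ip -> time when the block expires

    def allow(self, ip, now=None):
        """Return True if this request may proceed, False if it is blocked."""
        now = time.time() if now is None else now
        # Still serving out a block?
        if self.blocked_until.get(ip, 0) > now:
            return False
        q = self.hits[ip]
        # Drop timestamps that have fallen out of the window.
        while q and q[0] <= now - WINDOW:
            q.popleft()
        q.append(now)
        if len(q) > MAX_REQUESTS:
            # Rate exceeded: block this IP for BLOCK_TIME seconds.
            self.blocked_until[ip] = now + BLOCK_TIME
            return False
        return True
```

A human clicking through pages stays well under 15 requests a minute; a bulk downloader trips the limit almost immediately and only costs the server one cheap check per request while blocked.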


something like this?
PHP Classes - Class: Flood Recorder

Nils K. Windisch, 2005-05-17

it might have been google's recently 'recalled' web accelerator plug-in.

brendan avery, 2005-05-17

do you use "Flood Recorder" or did you find anything else?

Nils K. Windisch, 2005-05-18
