httpd-dev mailing list archives

From Graham Leggett <minf...@sharp.fm>
Subject Re: URL scanning by bots
Date Wed, 01 May 2013 08:26:16 GMT
On 01 May 2013, at 2:47 AM, André Warnier <aw@ice-sa.com> wrote:

> With respect, I think that you misunderstood the purpose of the proposal.
> It is not a protection mechanism for any server in particular.
> And installing the delay on one server is not going to achieve much.
> 
> It is something that, if it is installed on enough webservers on the Internet, may slow
> down the URL-scanning bots (hopefully a lot), and thereby inconvenience their botmasters.
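
(For concreteness, the proposal amounts to something like the following sketch, which is my
own illustration rather than anything André posted: stall only the responses for URLs that
don't exist, so ordinary visitors are unaffected. The five-second delay, the port, and the
path list are all invented for the example.)

import time
from http.server import BaseHTTPRequestHandler, HTTPServer

DELAY_SECONDS = 5                    # made-up value; the thread doesn't fix one
KNOWN_PATHS = {'/', '/index.html'}   # stand-in for the site's real content

class DelayingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in KNOWN_PATHS:
            self.send_response(200)
            self.send_header('Content-Type', 'text/plain')
            self.end_headers()
            self.wfile.write(b'ok\n')
        else:
            # Stall before answering the probe for a non-existent URL.
            time.sleep(DELAY_SECONDS)
            self.send_response(404)
            self.end_headers()

if __name__ == '__main__':
    HTTPServer(('', 8080), DelayingHandler).serve_forever()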

You need to consider the environment a typical URL scanner runs in: the open internet, which
consists of vast swaths of machines that don't exist, machines that hang when you connect,
machines hidden behind firewalls so they look like they don't exist, and machines on slow
connections that respond slowly. Bots are already engineered to handle these real-world
conditions, so encountering a slow host is just something they do anyway, and they are very
unlikely to be slowed down by it. And if they are slowed down, the bot authors simply treat
that problem as a bug, fix it, and carry on with what they were doing.
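
To see why, consider what a scanner that copes with the open internet looks like. The sketch
below is my own illustration, not any real bot's code; the hosts, paths, three-second timeout
and worker count are all invented. A short per-request timeout plus high concurrency means a
tarpitted host costs the bot one worker for a few seconds, not the scan:

import concurrent.futures
import urllib.request
from urllib.error import HTTPError

TIMEOUT = 3   # seconds: the most a dead or stalling host can cost one worker

def probe(host, path):
    """Fetch one URL; timeouts and refusals are just another kind of miss."""
    try:
        with urllib.request.urlopen('http://%s%s' % (host, path),
                                    timeout=TIMEOUT) as resp:
            return host, path, resp.status
    except HTTPError as err:
        return host, path, err.code    # 404 etc. still answers the question
    except OSError:
        return host, path, None        # dead, firewalled, or tarpitted: move on

hosts = ['192.0.2.1', '192.0.2.2']           # TEST-NET addresses, illustrative
paths = ['/phpmyadmin/', '/wp-login.php']    # the usual scanner fare

# With a hundred probes in flight, a host that stalls every response ties up
# one worker for TIMEOUT seconds while the other ninety-nine carry on.
with concurrent.futures.ThreadPoolExecutor(max_workers=100) as pool:
    jobs = [(h, p) for h in hosts for p in paths]
    for host, path, status in pool.map(lambda hp: probe(*hp), jobs):
        if status is not None:
            print(host, path, status)

If every delayed 404 simply turns into a timeout, the bot loses a little information and
almost no time.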

Regards,
Graham
--

