httpd-dev mailing list archives

From: Stefan Fritsch
Subject: Re: URL scanning by bots
Date: Wed, 01 May 2013 13:03:05 GMT
On Wednesday 01 May 2013, Graham Leggett wrote:
> Of course it might have an effect - the real important question is
> will it have a useful effect.
> A bot that gives up scanning a box that by definition isn't
> vulnerable to that bot (thus the 404) doesn't achieve anything
> useful, the bot failed to infect the host before, it fails to
> infect the host now, nothing has stopped the bot moving to the
> next host and trying its luck there. Perhaps it does achieve a
> reduction in traffic for you, but that is for you to decide, and
> the tools already exist for you to achieve this.

In my experience, a single bot will often scan for dozens of 
vulnerable php applications. If the delay causes the bot to go away 
before it has scanned for all of them, that may decrease the 
likelihood that forgotten, badly maintained web applications are 
found by bots. This would be a positive effect.

With mpm event, the delay could be done without tying up a worker 
(like mod_dialup does). But even with the other mpms, I don't think 
the potential for DoS would increase, provided the 404 delay is kept 
low enough that the sum of the delay and the time to read the request 
headers stays below mod_reqtimeout's header read timeout value.
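To make that constraint concrete, here is a minimal sketch using 
mod_reqtimeout's stock RequestReadTimeout directive (the timeout 
values are illustrative, and the 404 delay itself is hypothetical; 
no such directive exists in httpd):

```apache
# mod_reqtimeout: allow at most 20 seconds to receive the request
# headers, extendable up to 40 seconds if data keeps arriving at
# 500 bytes/s or more. These are the shipped example values.
RequestReadTimeout header=20-40,MinRate=500

# Constraint from the argument above: any artificial 404 delay D
# must satisfy
#     D + (time to read request headers) < header timeout (20s here)
# so a delay of a few seconds would be safe, while one approaching
# 20 seconds would start interfering with mod_reqtimeout.
```

Keeping the delay well under the header timeout means a client that 
is deliberately slow cannot hold a worker longer than mod_reqtimeout 
already allows, so the worst-case worker occupancy is unchanged.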

I would not be against anyone implementing such a delay scheme. But I 
am definitely not volunteering because I am not sure if it would 
actually be worth the effort. And there are other more important 
issues to fix.
