httpd-dev mailing list archives

From Lazy <lazy...@gmail.com>
Subject Re: URL scanning by bots
Date Tue, 30 Apr 2013 18:54:47 GMT
2013/4/30 Graham Leggett <minfrin@sharp.fm>

> On 30 Apr 2013, at 12:03 PM, André Warnier <aw@ice-sa.com> wrote:
>
> > The only cost would be a relatively small change to the Apache webservers,
> > which is what my suggestion consists of: adding a variable delay (say
> > between 100 ms and 2000 ms) to any 404 response.
>
> This would have no real effect.
>
> Bots are patient, slowing them down isn't going to inconvenience a bot in
> any way. The simple workaround if the bot does take too long is to simply
> send the requests in parallel. At the same time, slowing down 404s would
> break real websites, as 404 isn't necessarily an error, but rather simply a
> notice that says the resource isn't found.
>
> Regards,
> Graham
> --
>
>
If you want to slow down the bots, I would suggest using

mod_security + simple scripts + ipset + iptables TARPIT in the raw table

This way you can efficiently block a very large number of IP addresses,
and TARPIT takes care of delaying new bot connections at minimal cost
(much lower than delaying the request in userspace, or even returning
an error code).

http://ipset.netfilter.org/
http://serverfault.com/questions/113796/setting-up-tarpit-technique-in-iptables
http://www.modsecurity.org/documentation/modsecurity-apache/1.9.3/html-multipage/05-actions.html
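A rough sketch of the setup described above. It assumes the TARPIT target
from xtables-addons is installed, and the set name "scanners" and the ban
script path are made up for illustration; wiring the script to a matching
mod_security rule (e.g. via its exec action) is left out:

```shell
# Set holding banned scanner IPs; entries expire after an hour.
ipset create scanners hash:ip timeout 3600

# Raw table: skip connection tracking for tarpitted sources, so the
# deliberately held-open connections don't fill the conntrack table.
iptables -t raw -I PREROUTING -p tcp --dport 80 \
    -m set --match-set scanners src -j NOTRACK

# Tarpit matching connections: the kernel accepts the SYN and then pins
# the TCP window to zero, leaving the bot's socket stuck at almost no
# cost to the server.
iptables -I INPUT -p tcp --dport 80 \
    -m set --match-set scanners src -j TARPIT

# Hypothetical ban script a mod_security rule could exec on a scan hit;
# adding to an ipset is O(1), so this scales to many thousands of IPs.
#   /usr/local/bin/ban-scanner.sh:
#     ipset add scanners "$REMOTE_ADDR" -exist
```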
-- 
Michal Grzedzicki
