httpd-dev mailing list archives

From Reindl Harald <h.rei...@thelounge.net>
Subject Re: URL scanning by bots
Date Wed, 01 May 2013 10:53:36 GMT


On 01.05.2013 11:37, Ben Laurie wrote:
>> Well, no, actually this is not accurate. You are assuming that these
>> bots are written using blocking io semantics; that if a bot is delayed
>> by 2 seconds when getting a 404 from your server, it is not able to do
>> anything else in those 2 seconds. This is just incorrect.
>> Each bot process could launch multiple requests to multiple unrelated
>> hosts simultaneously, and select whatever ones are available to read
>> from. If you could globally add a delay to bots on all servers in the
>> world, all the bot owner needs to do to maintain the same throughput
>> is to raise the concurrency level of the bot's requests. The bot does
>> the same amount of work in the same amount of time, but now all our
>> servers use extra resources and are slow for clients on 404.
> 
> So your argument is that extra connections use resources in servers
> but not clients?

no - his argument was that the OP's proposal cripples your one server
while the botnet has thousands of clients and only one job; your
server's primary job is to serve clients, not to play games with
resources
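to illustrate the point made above: a bot written with non-blocking
I/O overlaps the delays, so a per-request tarpit costs it almost no
throughput. a minimal sketch (hypothetical bot, Python asyncio,
asyncio.sleep standing in for the server's 404 delay, scaled down
from 2 seconds to keep the example quick):

```python
import asyncio
import time

# Scaled-down stand-in for the proposed 2-second 404 tarpit.
DELAY = 0.2

async def probe(host: str) -> str:
    # Simulate one delayed 404 response from a tarpitting server.
    await asyncio.sleep(DELAY)
    return f"{host}: 404"

async def scan(hosts: list[str]) -> list[str]:
    # All probes run concurrently; the delays overlap instead of adding up.
    return await asyncio.gather(*(probe(h) for h in hosts))

hosts = [f"host{i}.example" for i in range(50)]
start = time.monotonic()
results = asyncio.run(scan(hosts))
elapsed = time.monotonic() - start
# 50 delayed responses finish in roughly one DELAY, not 50 of them.
print(f"{len(results)} probes in {elapsed:.2f}s")
```

the bot owner just raises concurrency; meanwhile each tarpitted
connection sits in the server's worker pool for the full delay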

