httpd-users mailing list archives

From Dragon <>
Subject Re: [users@httpd] How to rid a pest?
Date Mon, 17 Dec 2007 16:05:59 GMT
Charles Michener wrote:
>I have a couple of spider bots hitting my server that I do not wish 
>to have access to my pages - they ignore robots.txt, so I finally 
>put them on my 'deny from xxxxx' list. This does deny them access 
>but they persist to keep trying - trying each page address at least 
>30 times - several hits per second. Is there a standard method to 
>forward them to some black hole or the FBI or ...?
---------------- End original message. ---------------------
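
As a point of reference, the 'deny from' approach mentioned above might look something like this in an Apache 2.0/2.2 configuration (the directory path and addresses below are placeholders, not the poster's actual values):

```apache
# Block the offending bots by source address.
# 192.0.2.10 and 198.51.100.0/24 are example addresses -- substitute
# the real ones. Apache 2.0/2.2 syntax shown; Apache 2.4 replaces
# Order/Allow/Deny with "Require all granted" / "Require not ip ...".
<Directory "/var/www/html">
    Order allow,deny
    Allow from all
    Deny from 192.0.2.10
    Deny from 198.51.100.0/24
</Directory>
```

With this in place Apache answers each request with 403 Forbidden, which is cheap to generate but still means the request reaches httpd.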

This is the kind of thing a router/firewall will handle for you.

Stopping these requests before they reach your machine is the best 
way to handle them. That said, sending a forbidden response back to 
the offenders doesn't really have much impact on the server's 
performance. It takes a little bit of processing, but the cost per 
request is pretty insignificant.

Hopefully they will eventually give up, but if they don't, look into 
using a firewall to deny them at the edge of your network.
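
If the firewall is the server itself (or a Linux router), one way to sketch this is with iptables; the addresses here are examples, not the actual offenders, and the commands need root:

```shell
# Drop all packets from the offending sources before httpd ever sees them.
# 192.0.2.10 and 198.51.100.0/24 are placeholder addresses -- substitute
# the bots' real source addresses from your access logs.
iptables -I INPUT -s 192.0.2.10 -j DROP
iptables -I INPUT -s 198.51.100.0/24 -j DROP

# List the INPUT chain to verify the rules are in place:
iptables -L INPUT -n
```

DROP silently discards the packets (the "black hole" effect), so the bot gets no response at all and just waits for a timeout, instead of receiving a quick 403 it can immediately retry.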


  Venimus, Saltavimus, Bibimus (et naribus canium capti sumus)

The official User-To-User support forum of the Apache HTTP Server Project.
