httpd-users mailing list archives

From Joshua Slive <>
Subject Re: Possible rehash of a question, but looking for a different solution
Date Thu, 07 Mar 2002 17:08:11 GMT

On Thu, 7 Mar 2002, Aaron wrote:

> I have seen many ways of using CGI scripts and User-Agent variables to block
> spiders from an Apache web site.
> I am looking for something a little more complex that can be done in the
> httpd.conf file (so that it is not dependent on anything external) which will
> detect when a user has accessed a spam-bot/spider trap page, and will then set a
> variable which will follow them through their sessions.

There is no such thing as a "session" in HTTP, so I think you need to
refine your requirements a little.  You can create sessions with cookies,
but obviously that will not help for a spider that doesn't accept cookies.
You could also block all requests from a specific IP address.  But then
you may block other users of the same proxy.
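As a rough illustration of the cookie approach, something like the following httpd.conf fragment could work for clients that do accept cookies. The trap path, cookie name, domain, and document root here are all made-up placeholders, and this is only a sketch, not a tested configuration:

```apache
# Hypothetical trap: any request to /spider-trap/ (a path you would
# link invisibly from your pages) gets a marker cookie set via
# mod_rewrite's CO= flag.
RewriteEngine On
RewriteRule ^/spider-trap/ - [CO=trapped:1:.example.com]

# Later requests that present the marker cookie are refused.
SetEnvIf Cookie "trapped=1" is_bad_bot
<Directory "/usr/local/apache/htdocs">
    Order Allow,Deny
    Allow from all
    Deny from env=is_bad_bot
</Directory>
```

Again, a spider that ignores Set-Cookie headers will sail right past this, which is why the cookie approach alone is not a complete answer.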

In other words, this is a very difficult problem, so you need to be very
specific about what you are trying to do.


The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:> for more info.
To unsubscribe, e-mail:
For additional commands, e-mail:
