httpd-users mailing list archives

From "Aaron" <micro...@microchp.org>
Subject Possible rehash of a question, but looking for a different solution
Date Thu, 07 Mar 2002 16:46:55 GMT
I have seen many ways using cgi scripts and User-Agent variables to block
spiders from an Apache web site.
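For reference, the sort of thing I have seen looks roughly like the usual User-Agent match done with mod_setenvif and mod_access (the agent strings and directory below are just placeholders, not a working recommendation):

    # Rough sketch of the common User-Agent approach -- easy to defeat
    SetEnvIfNoCase User-Agent "wget|webzip|teleport" bad_bot
    <Directory "/var/www/html">
        Order Allow,Deny
        Allow from all
        Deny from env=bad_bot
    </Directory>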

I am looking for something a little more complex that can be done entirely in the
httpd.conf file (so that it is not dependent on anything external): something that
detects when a user has accessed a spam-bot/spider trap page and then sets a
variable which follows them through the rest of their session.

I tried playing around with Request_URI, but that only applies per-request.

The reason I do not rely on User-Agent is that most mirroring programs now let
people simply mimic a normal web browser.

Does anyone know of a way, within any Apache install (whether on Linux, BSD,
Win2k, AIX, or whatever), that I can trap users who hit a page that would not
otherwise be accessed by normal browsing?
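
To make it concrete, something along these lines is roughly what I am after. This is only a sketch (the /bot-trap/ URL and cookie name are made up, and I have not tested it), using mod_rewrite's cookie flag so the marker follows the client on later requests:

    RewriteEngine On

    # Any client that requests the (hidden) trap page gets a marker cookie
    RewriteRule ^/bot-trap/ - [CO=trapped:1:.example.com:1440:/]

    # Later requests carrying the marker cookie are refused
    RewriteCond %{HTTP_COOKIE} trapped=1
    RewriteRule .* - [F]

The idea being that /bot-trap/ is linked only in a way a normal browser would never follow.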

I apologize if this has already been beaten to death, though I did not see
anything at quite this level of detail in the archives.

Thanks,

Aaron

---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org

