httpd-dev mailing list archives

From Andrew Wilson <and...@aaaaaaaa.demon.co.uk>
Subject Re: robot denial
Date Thu, 18 Jul 1996 23:12:46 GMT
> deny agent Crawler/1.4 SiteTrasher/0.0001 Mozilla/42.42
> 
> -=-=-=-=
> 
> What do people think is the best route? I like the latter. Is it possible
> with the API to write "deny agent" as a module? Or is it a patch job?

What return value do you want to give to bad robots?  The easy way
is to make the module detect a baddie and then return SERVER_ERROR
to crash the request, letting the core take care of the fallout.
This could upset sites with funky error-handler CGIs, however, so
another option would be to send out some faked-up page with a
more clueful HTTP status than plain old 500.

Perhaps you could make it more configurable with a directive to
allow Joe Admin to define his own redirect message...

	RobotRedirect /Errors/bad_robot.html

which would get sent to those robots that are denied.
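The matching half of such a module could look something like this standalone sketch. Note this is illustrative only, not the Apache module API: `robot_status` and the hard-coded deny list are hypothetical stand-ins for agents a "deny agent" directive would configure.

```c
#include <string.h>

/* Hypothetical deny list; a real module would read these from a
 * "deny agent" directive in the config rather than hard-coding. */
static const char *denied_agents[] = {
    "Crawler/1.4", "SiteTrasher/0.0001", "Mozilla/42.42", 0
};

/* Pick an HTTP status for the given User-Agent header.
 * Returns 403 (Forbidden) for a denied agent -- more clueful than
 * a blanket 500 -- or 0 to decline and let the request proceed. */
int robot_status(const char *user_agent)
{
    int i;
    if (!user_agent)
        return 0;
    for (i = 0; denied_agents[i]; i++) {
        /* Prefix match, so "Crawler/1.4 (comment)" is still caught. */
        if (strncmp(user_agent, denied_agents[i],
                    strlen(denied_agents[i])) == 0)
            return 403;
    }
    return 0;
}
```

On a 403 the module could then send the admin-defined RobotRedirect page as the response body instead of the core's error handling.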

> rob

Ay.
