Received: by taz.hyperreal.com (8.6.12/8.6.5) id HAA26293; Fri, 19 Jul 1996 07:02:50 -0700
Received: from sierra.zyzzyva.com by taz.hyperreal.com (8.6.12/8.6.5) with ESMTP id HAA26285; Fri, 19 Jul 1996 07:02:46 -0700
Received: from zyzzyva.com (localhost [127.0.0.1]) by sierra.zyzzyva.com (8.7.5/8.6.11) with ESMTP id JAA07268 for ; Fri, 19 Jul 1996 09:02:44 -0500 (CDT)
Message-Id: <199607191402.JAA07268@sierra.zyzzyva.com>
To: new-httpd@hyperreal.com
Subject: Re: robot denial
In-reply-to: Dirk.vanGulik's message of Fri, 19 Jul 1996 09:19:13 +0200. <9607190719.AA28797@jrc.it>
X-uri: http://www.zyzzyva.com/
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Date: Fri, 19 Jul 1996 09:02:44 -0500
From: Randy Terbush
Sender: owner-new-httpd@apache.org
Precedence: bulk
Reply-To: new-httpd@hyperreal.com

> > But perhaps it might be better to extend the authorization stuff to
> > allow something like
> >
> > order allow,deny
> > allow from all
> > deny agent /robot
> >
> > which would also allow denial to individual user agents e.g.
> >
> > deny agent Crawler/1.4 SiteTrasher/0.0001 Mozilla/42.42

> Excellent idea !
>
> I really like the latter; if no-one else goes for it I'll write
> a module which does.
>
> Dw.

I assume that we could still use:

deny from 1.2.3.4
deny agent Crawler/1.4 SiteTrasher/0.0001 Mozilla/42.42

to only restrict agents from that address?
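
For reference, the combined form being asked about might look like the following access.conf-style sketch. Note this is hypothetical: `deny agent` is only a proposal in this thread, not an implemented directive, and whether the two deny lines scope together (blocking those agents only from that address) rather than applying independently is exactly the open question.

```
# Hypothetical fragment -- "deny agent" is a proposed directive only.
order allow,deny
allow from all
# Proposed combination: address-based and agent-based denial together.
# Open question: do these stack (agents denied only from 1.2.3.4),
# or does each deny line apply on its own?
deny from 1.2.3.4
deny agent Crawler/1.4 SiteTrasher/0.0001 Mozilla/42.42
```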