httpd-users mailing list archives

From Matt <>
Subject [users@httpd] Allowing Robots.txt
Date Mon, 10 Oct 2011 16:45:29 GMT
In .htaccess I have something like this:

order allow,deny

allow from 192.168.x.x/24

<Files robots.txt>
allow from all
</Files>

RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteCond %{REQUEST_URI} /
RewriteRule ^(.*)$ https://%{HTTP_HOST}$1 [R,L]

AuthName "Restricted Area"
AuthType Basic
AuthUserFile /var/www/.htpasswd
AuthGroupFile /dev/null
require valid-user

I want to restrict HTTP access to the server to certain subnets, and
require SSL plus a username and password.  The exception is the
robots.txt file: I want to allow anyone access to that.  How do I
tell Apache not to enforce the password or the SSL redirect for
robots.txt alone?
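Would something along these lines work?  I'm guessing that `Satisfy Any`
inside the Files block, plus a RewriteCond excluding robots.txt from the
SSL redirect, is the mechanism, but I haven't verified it:

```apache
# Sketch only (Apache 2.2 syntax) -- with "Satisfy Any", a request is
# allowed if EITHER the host-based Allow rules OR the Basic auth
# requirement is met, so "Allow from all" alone admits everyone to
# robots.txt without a password.
<Files robots.txt>
    Order allow,deny
    Allow from all
    Satisfy Any
</Files>

# Skip the HTTPS redirect for robots.txt so crawlers on port 80
# aren't forced through SSL:
RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteRule ^(.*)$ https://%{HTTP_HOST}$1 [R,L]
```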

The official User-To-User support forum of the Apache HTTP Server Project.
