incubator-ooo-dev mailing list archives

From Herbert Duerr <...@apache.org>
Subject Re: Wiki robots.txt
Date Mon, 22 Oct 2012 09:29:18 GMT
On 21.10.2012 15:13, imacat wrote:
>      I found the following rule in the robots.txt of our wiki:
>
> User-Agent: *
> Disallow: /
>
>      Does anyone know if there is any special reason why it is set this
> way?  Does anyone have a reason to keep it?  I'm thinking of removing
> this rule.

+1, blocking all search robots makes no sense.
Google etc. are also much better at finding relevant results for 
non-trivial searches. The wiki's built-in search had many problems [1], 
though many of them have since been fixed.

[1] http://www.mediawiki.org/wiki/Search_issues
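Instead of simply deleting the rule, a middle ground would be a 
robots.txt that lets search engines index the article pages while still 
keeping crawlers out of the script path used for edit, history and diff 
views. A minimal sketch, assuming article pages are served under /wiki/ 
and the MediaWiki scripts under /w/ (which may not match our actual URL 
setup):

# Allow crawling of article pages, but keep robots out of the
# script path used for edit/history/diff views.
# (Paths are assumptions; adjust to this wiki's actual URL layout.)
User-agent: *
Disallow: /w/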

Herbert
