httpd-cvs mailing list archives

Subject cvs commit: httpd-docs-1.3/htdocs/manual/misc howto.html compat_notes.html
Date Mon, 06 Nov 2000 22:45:42 GMT
slive       00/11/06 14:45:37

  Modified:    htdocs/manual/misc howto.html compat_notes.html
  Grammar/Style fixes.
  Submitted by:	Chris Pepper <>
  Revision  Changes    Path
  1.13      +7 -8      httpd-docs-1.3/htdocs/manual/misc/howto.html
  Index: howto.html
  RCS file: /home/cvs/httpd-docs-1.3/htdocs/manual/misc/howto.html,v
  retrieving revision 1.12
  retrieving revision 1.13
  diff -u -r1.12 -r1.13
  --- howto.html	1999/07/30 09:51:01	1.12
  +++ howto.html	2000/11/06 22:45:25	1.13
  @@ -57,11 +57,11 @@
   RewriteRule /.* [R]
  -This will send an HTTP 302 Redirect back to the client, and no matter
  +will send an HTTP 302 Redirect back to the client, and no matter
   what they gave in the original URL, they'll be sent to
  -The second option is to set up a <CODE>ScriptAlias</CODE> pointing to
  +<p>The second option is to set up a <CODE>ScriptAlias</CODE> pointing
   a <STRONG>CGI script</STRONG> which outputs a 301 or 302 status and the
   of the other server.</P>
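  For reference, the first option described in the hunk above might look like this in practice (a minimal sketch; the target hostname is a placeholder, not taken from the patch):

  ```
  RewriteEngine on
  RewriteRule ^/(.*) http://other.example.com/$1 [R]
  ```

  The <CODE>[R]</CODE> flag makes mod_rewrite answer with an HTTP 302 Redirect by default; <CODE>[R=301]</CODE> would make it permanent.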
  @@ -89,7 +89,7 @@
         "Location:\r\n" .
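  The CGI approach from the hunk above can be sketched as a small shell script (the target URL is a placeholder, not the one elided in the patch):

  ```shell
  #!/bin/sh
  # Minimal redirect CGI sketch. A "Status: 302" header plus a
  # "Location:" header is enough for httpd to send the redirect.
  redirect() {
      printf 'Status: 302 Found\r\n'
      printf 'Location: %s\r\n' "$1"
      printf '\r\n'
  }
  redirect 'http://other.example.com/'
  ```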
  @@ -118,7 +118,6 @@
   mv access_log access_log.old<BR>
   kill -1 `cat`
   <P>Note: <CODE></CODE> is a file containing the
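  The rotation sequence in the hunk above, wrapped in a function for clarity (the directory layout is hypothetical; the real <CODE>PidFile</CODE> location depends on your configuration):

  ```shell
  #!/bin/sh
  # Rotate the access log, then HUP httpd so it reopens its log files.
  # The path layout here is a hypothetical example.
  rotate_access_log() {
      logdir=$1
      mv "$logdir/access_log" "$logdir/access_log.old"
      # SIGHUP tells the parent httpd process to restart and
      # reopen its log files.
      kill -1 "$(cat "$logdir/httpd.pid")"
  }
  ```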
  @@ -135,7 +134,7 @@
   <CODE>robots.txt</CODE> which you don't have, and never did have?</P>
   <P>These clients are called <STRONG>robots</STRONG> (also known as crawlers,
  -spiders and other cute name) - special automated clients which
  +spiders and other cute names) - special automated clients which
   wander around the web looking for interesting resources.</P>
   <P>Most robots are used to generate some kind of <EM>web index</EM> which
  @@ -155,7 +154,7 @@
   <P>Another reason some webmasters want to block access to robots, is to
   stop them indexing dynamic information. Many search engines will use the
  -data collected from your pages for months to come - not much use if your
  +data collected from your pages for months to come - not much use if you're
   serving stock quotes, news, weather reports or anything else that will be
   stale by the time people find it in a search engine.</P>
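  The standard way to turn robots away is a <CODE>robots.txt</CODE> file at the top of the server's document tree (the disallowed paths below are only examples):

  ```
  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /quotes/
  ```

  A well-behaved robot fetches <CODE>/robots.txt</CODE> first and skips any path matching a <CODE>Disallow</CODE> line for its <CODE>User-agent</CODE>.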
  @@ -194,7 +193,7 @@
   Requests on port 80 of my proxy <SAMP>nicklas</SAMP> are forwarded to
  -proxy<SAMP></SAMP>, while requests on port 443 are
  +<SAMP></SAMP>, while requests on port 443 are
   forwarded to <SAMP></SAMP>.
   If the remote proxy is not set up to
   handle port 443, then the last directive can be left out. SSL requests
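  The forwarding described above can be expressed with mod_proxy's <CODE>ProxyRemote</CODE> directive; a sketch with placeholder hostnames (the real ones are elided in the patch):

  ```
  ProxyRemote http  http://remote-proxy.example.com:8080
  ProxyRemote https http://remote-proxy.example.com:8080
  ```

  If the remote proxy cannot handle port 443, drop the second line and SSL requests will be handled directly.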
  1.27      +5 -5      httpd-docs-1.3/htdocs/manual/misc/compat_notes.html
  Index: compat_notes.html
  RCS file: /home/cvs/httpd-docs-1.3/htdocs/manual/misc/compat_notes.html,v
  retrieving revision 1.26
  retrieving revision 1.27
  diff -u -r1.26 -r1.27
  --- compat_notes.html	2000/09/18 02:09:46	1.26
  +++ compat_notes.html	2000/11/06 22:45:30	1.27
  @@ -22,7 +22,7 @@
   due to the fact that the parser for config and access control files
   was rewritten from scratch, so certain liberties the earlier servers
   took may not be available here.  These are all easily fixable.  If you
  -know of other non-fatal problems that belong here, <A
  +know of other problems that belong here, <A
   HREF="">let us know.</A>
   <P>Please also check the <A HREF="known_client_problems.html">known
  @@ -43,7 +43,7 @@
   <LI>If you follow the NCSA guidelines for setting up access
       restrictions based on client domain, you may well have added
  -    entries for, <CODE>AuthType, AuthName, AuthUserFile</CODE> or
  +    entries for <CODE>AuthType, AuthName, AuthUserFile</CODE> or
       <CODE>AuthGroupFile</CODE>.  <STRONG>None</STRONG> of these
       needed (or appropriate) for restricting access based on client
       domain.  When Apache sees <CODE>AuthType</CODE> it (reasonably)
  @@ -57,7 +57,7 @@
   <LI><CODE>exec cgi=""</CODE> produces reasonable <STRONG>malformed
     header</STRONG> responses when used to invoke non-CGI scripts.<BR>
  -  The NCSA code ignores the missing header. (bad idea)
  +  The NCSA code ignores the missing header (bad idea).
     <BR>Solution: write CGI to the CGI spec and use
     <CODE>include&nbsp;virtual</CODE>, or use <CODE>exec cmd=""</CODE>
  @@ -90,7 +90,7 @@
       booting unless an added <CODE>optional</CODE> keyword is included.
  -<LI>Apache does not implement <CODE>OnDeny</CODE> use
  +<LI>Apache does not implement <CODE>OnDeny</CODE>; use
       <A HREF="../mod/core.html#errordocument"><CODE>ErrorDocument</CODE></A>
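  The <CODE>ErrorDocument</CODE> replacement for <CODE>OnDeny</CODE> mentioned above might look like this (the path is hypothetical):

  ```
  # Serve a custom page when access is denied (hypothetical path)
  ErrorDocument 403 /errors/denied.html
  ```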
  @@ -115,7 +115,7 @@
   <LI>Apache does not allow ServerRoot settings inside a VirtualHost
       container.  There is only one global ServerRoot in Apache; any desired
       changes in paths for virtual hosts need to be made with the explicit
  -    directives, eg. DocumentRoot, TransferLog, <EM>etc.</EM>
  +    directives, <em>e.g.</em>, DocumentRoot, TransferLog, <EM>etc.</EM>
   <LI>The <CODE>AddType</CODE> directive cannot be used to set the type
