httpd-docs mailing list archives

From Michael.Schro...@telekurs.de
Subject Reply: Re: links to english originals
Date Wed, 25 Sep 2002 19:03:23 GMT

Hi Joshua,


>> I actually would prefer a SSI
> I'd really prefer not to have to parse all the docs for SSI just for
> this little thing.

is this a performance issue on the server, or do you have
fundamental reasons against using SSI in this situation?

>> or better a post processing
>> solution (for example a perl script similar to expand.pl).
> That is a possibility.  I would want to use a proper xml parser
> (possibly from within perl) and not just a search/replace thing.
>
> Anyway, lets think about the problem a little more before we try
> anything.  That way we can hopefully come up with a solution that
> will help us in other areas as well.

I have a case that looks similar in some aspects, so it might
add to the discussion if I just describe what I am doing there.

My mod_gzip pages
     http://www.schroepl.net/projekte/mod_gzip/
contain some features that are implemented via SSI:
a) I want to maintain one navigation menu for all pages
b) I want the navigation button for the currently viewed
   page to be different from the rest (i.e. highlighted
   and not clickable).
The first thing is done via SSI
     <!--#include virtual="..." -->;
the second one is done via XSSI
     <!--#if expr="${REQUEST_URI} = /filename.htm/" -->
     <!--#else -->
     <!--#endif -->
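To make the pattern concrete, here is a hedged sketch of one such menu
entry as it might look in a shared navigation file (the label "Home",
the file name index.htm, and the <b>/<a> markup are made-up examples,
not taken from my actual pages):

     <!--#if expr="${REQUEST_URI} = /index\.htm/" -->
       <b>Home</b>
     <!--#else -->
       <a href="index.htm">Home</a>
     <!--#endif -->

On the page index.htm itself the entry is rendered as plain highlighted
text; on every other page it becomes a link.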

But like you, I don't want the SSI files to be interpreted
for each and every access - I want to use the results of
the SSI process as static files (that can be stored in
compressed form and all that).

So I wrote a simple Perl script that reads the directory
where the SSI files are located, prepends an "http:" URL
prefix to each file name, executes an HTTP request for the
URL constructed this way (using LWP::Simple::getstore),
and saves the result into a file that has one level of
name extension less than the original file.

Thus I "compile" the SSI files - and then I upload the
resulting files to the server. These files still have
extensions that make the server perform language
negotiation for them ...
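For context, language negotiation of this kind is typically switched on
via MultiViews; a minimal httpd.conf sketch, assuming a made-up document
directory and the usual .en/.de extension scheme (this is illustrative,
not my actual configuration):

     # Let Apache choose between page.html.en and page.html.de
     # based on the client's Accept-Language header
     <Directory "/www/projekte/mod_gzip">
         Options +MultiViews
     </Directory>
     AddLanguage en .en
     AddLanguage de .de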

Regards, Michael




#!/usr/bin/perl
#################################################
### generate static HTML files from SSI files ###
#################################################

  use strict;
  use LWP::Simple;

  my $root_path = $ARGV[0];   # target directory for the static files
  my $root_url  = $ARGV[1];   # base URL of the SSI source directory

# scan the current directory for SSI source files
  opendir (DIR, '.');
  my @entries = readdir (DIR);
  closedir (DIR);

  foreach my $this_entry (@entries)
          {

            # generate a "*" file for each "*.shtml" there
              if ($this_entry =~ /^(.*)\.shtml$/)
                 {

                   # isolate resulting file name
                     my $file_truename = $1;

                   # form corresponding path name
                     my $new_path = "$root_path/$file_truename";
                     print "generating $new_path ...\n";

                   # HTTP-GET the corresponding URL content
                     LWP::Simple::getstore ("$root_url/$file_truename.shtml", $new_path);

                 }

          }
# (I bet Andy can reduce this by another 50% ;-)



makehtml.bat:
     perl makehtml.pl n:/www/projekte/mod_gzip http://localhost/projekte/mod_gzip.src/ssi



---------------------------------------------------------------------
To unsubscribe, e-mail: docs-unsubscribe@httpd.apache.org
For additional commands, e-mail: docs-help@httpd.apache.org

