httpd-dev mailing list archives

Site index · List index
Message view « Date » · « Thread »
Top « Date » · « Thread »
From TOKI...@aol.com
Subject How will filtering deal with foo.htm.gz files?
Date Tue, 29 Aug 2000 04:06:43 GMT

Adding all the ZLIB and Content-decoding stuff to ApacheBench
and testing it against a ton of static xxxxxx.htm.gz compressed
files got me wondering...

How is any filtering scheme going to deal with pre-compressed
versions of requested objects?

If, at any particular site, there is supposed to be some important
outbound filtering going on for all deliverables then what if the 
content negotiation finds an already compressed version of the entity
buried down inside someone's virtual host home dir or whatever?

Is the filtering engine going to DECOMPRESS it just so it can pass through
whatever 'high priority' filters are in place or is it just going to 'force it
through' untouched by any filter?

If the 'forced passthru' option will rule... then how are the installed 
filters going to be told to not touch the output? By mime type?

Just curious. I don't have a good answer other than requiring the
filtering engine to decompress it, pass it through the installed
filters, and then compress it again in the same encoding format
before it goes out the door.

Another interesting scenario that this brings up is that perhaps
anyone who writes an insertable real-time filter must also write
the static stand-alone version of the same filter so that it can
be applied to static pre-compressed files that will be sitting
on that Server.

In other words... if Apache has no good way to perform the
filtering in real-time because it can't unpack the requested
object then all the important filtering on that site will have
to be applied offline and statically to the pre-compressed file(s).

Yours...
Kevin Kiley
CTO, Remote Communications, Inc.
http://www.RemoteCommunications.com
http://www.rctp.com - Online Internet Content Compression Server.

Mime
View raw message