httpd-dev mailing list archives

From: Apa...@aol.com
Subject: Re: Compression via content negotiation
Date: Wed, 02 Dec 1998 20:05:14 GMT

> Paul Ausbeck wrote...

> It is clear from reading MS doc that they have spent considerable
> effort putting compression into their internet information server...

What document are you referring to? Where is it? Please identify it
or supply a LINK to the 'document'. Sometimes what is 'clear' to one
reader of a document will not be as 'clear' to another. A lot of MS
documents are like that.

> ...In fact, I believe that the main reason MS put both gzip and
> deflate compression into the browser is to work with
> their server. PNG support is just a sideshow.

If their intent was to fully support the existing Content-Encoding
specification ( as you 'believe' ) and not just the .PNG format,
then why does their support for .PNG decompression work while their
support for other Content-Encoding schemes does not? Looks to me
like they just wanted to get .PNG going so they could say it works.
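
Just so we are talking about the same thing, here is a minimal sketch
of what spec-compliant negotiation looks like from the client side.
It is modern Python for brevity ( obviously not what a 1998 browser
runs ) and www.example.com is only a placeholder host:

    import gzip
    import http.client

    # Placeholder host; any server that honors Accept-Encoding works.
    conn = http.client.HTTPConnection("www.example.com")
    conn.request("GET", "/", headers={"Accept-Encoding": "gzip"})
    resp = conn.getresponse()
    body = resp.read()

    # Per RFC 2068 the server labels a compressed entity with
    # Content-Encoding, and a compliant client must undo that coding
    # for ANY media type, not just image/png.
    if resp.getheader("Content-Encoding") == "gzip":
        body = gzip.decompress(body)

    print(resp.status, resp.getheader("Content-Type"), len(body))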

> I am not sure what Netscape has done on their server so I will
> investigate. Probably similar to MS though.

Exactly. Pretty much the same. .PNG works but full support for
all Content-Encoding scenarios is somewhere in the future.

> With regards to 'search engines' 'breaking'...
> All compression schemes of which I am aware make both compressed and
> uncompressed versions available. Returning compressed files if the agent
> has explicitly requested such and uncompressed files otherwise.

Huh? I thought one of the things you were pushing so hard for here
was to have the APACHE SERVER 'default' to sending the GZIP version
automatically even if it is not 'asked' for. That is the scenario
that will break most search engines ( and a lot of other things ).

Roy Fielding is a +1 on 'no defaulting to GZIP' for the same reasons,
if you follow the thread back a little. Too much breakage potential.
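
For concreteness, here is the conservative rule in sketch form. This
is Python with a made-up helper name, not anything out of the Apache
tree, and it ignores Accept-Encoding quality values:

    import gzip

    def choose_variant(accept_encoding, html_bytes):
        # Hypothetical selector: send the gzip variant only when the
        # client has explicitly listed gzip.  Quality parameters
        # ("gzip;q=0") are ignored in this sketch.
        encodings = [token.strip().split(";")[0].lower()
                     for token in (accept_encoding or "").split(",")]
        if "gzip" in encodings:
            return gzip.compress(html_bytes), "gzip"
        # No Accept-Encoding header: plain entity, and no
        # Content-Encoding header at all.
        return html_bytes, None

    body, enc = choose_variant("gzip, deflate", b"<html>hi</html>")
    assert enc == "gzip"
    body, enc = choose_variant(None, b"<html>hi</html>")
    assert enc is None

A crawler that sends no Accept-Encoding header falls through to the
uncompressed branch, which is exactly what 'no defaulting to GZIP'
protects.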

> I thought that firewalls operated at the protocol level. I didn't know
> they were concerned with content. If someone knows more on this, please
> post.

A lot of firewalls and their associated products these days depend
on being able to analyze the document itself as well as 'where it is
coming from'. They have tacked a lot of 'filtering' features onto
their products to expand their vertical markets, and those are the
features that will probably cause lots of problems when a
'non-readable' HTML document arrives. Sure... they could 'decompress'
and examine it if they knew what they were looking at... but the
point is that they don't do that now, and the breakage potential is
high in this area alone.
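
The failure mode is easy to demonstrate. A toy example in Python,
assuming a filter that just greps the body for a banned string:

    import gzip

    html = (b"<html><body>" + b"padding " * 40 +
            b"blocked-keyword</body></html>")
    packed = gzip.compress(html)

    # A naive filter that scans the raw bytes catches the token in
    # the plain entity but silently misses it once the entity is
    # gzip-encoded; to keep working it would have to notice
    # Content-Encoding and decompress first.
    assert b"blocked-keyword" in html
    assert b"blocked-keyword" not in packed
    assert b"blocked-keyword" in gzip.decompress(packed)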

> RFC 2068 clearly is written with the idea of html compression in mind.
> Microsoft has both ends of the solution working for their captive
> customers. They are also attempting to get large independent accounts to
> adopt their server side technology. In fact, I know through a friend
> that MS made an impressive compression demonstration at Yahoo over the
> summer.

Anyone can make an 'impressive compression demonstration' when they
get to re-code and re-compile both the server and the client being
used for the 'show'... that doesn't mean they used 'Content-Encoding'
according to existing specifications, or that their support for it is
in compliance with existing RFCs and/or HTML specifications.

Ask your 'friend' whether they used 'Content-Encoding' for the
'impressive compression demonstration' or whether they just
brute-forced it. I'm betting it was the latter for the 'Yahoo' demo.

