httpd-users mailing list archives

From Alex Moon <a.m...@mdx.ac.uk>
Subject RE: apache 2.0.39 under NT large pdf file failure
Date Fri, 09 Aug 2002 09:46:22 GMT
Sorry guys, but I've tried KeepAlive off and it has no effect on
the problem with IE.  Netscape 4.7 seems to behave fine, so
possibly the handling has improved there, but I cannot know which
browser will be used and I have to allow Acrobat 4 readers to
work.  I tried Carrie's site out and had problems there too.  I
guessed it might be something to do with the keepalive settings
and tried a range of experiments, but none improved the
situation.  I think I will have to wind the system back to an
older version until I can get this problem resolved.  By the
looks of it this is definitely a problem with the 2.0.39 build,
so maybe someone will spot it.  It's a shame, because other than
this 2.0.39 seemed to work very well and I was quite pleased with
the setup.
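
For reference, the per-PDF variant suggested in Carrie's thread
(quoted below) would be something like this in httpd.conf,
assuming mod_setenvif is loaded:

    SetEnvIf Request_URI "\.pdf$" nokeepalive

That sort of variant was among the keepalive experiments that
made no difference here.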

Before I take this backward step: is there a way I could force
PDF files to be opened independently by Acrobat Reader rather
than from within the browser, perhaps through the MIME type or
mime magic?  This seems to be the difference between the browsers
that work and those that don't, i.e. if Acrobat Reader opens in
its own separate window it works; if it opens within the browser
it fails.  By this I don't mean a new browser window, but getting
the reader to run separately from the browser.  If I could force
it to open in its own environment it might solve the problem, in
a botched way.  I know the reader will be either Acrobat Reader 4
or 5, because it is an intranet.  I hate PDF files at the moment;
they sometimes appear with error 109!  It seems to me that
somewhere something is being added to the PDFs that makes them
unreadable.
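
One idea I might try first (a sketch only, assuming mod_headers
is loaded in the 2.0.39 build, and untested with the Acrobat
plugin) is to mark PDFs as attachments so the browser hands them
to the reader instead of rendering them inline:

    <FilesMatch "\.pdf$">
        Header set Content-Disposition "attachment"
    </FilesMatch>

Whether IE 5.5 then launches Acrobat in its own window or just
prompts to save the file, I don't know.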
  
Any further ideas greatly appreciated.  I've set up an
experimental 2.0.39 and will keep it running until this is
resolved, then I can bring the live system back on.

Alex

On 9 Aug 02, at 10:14, Boyle Owen wrote:

> >From: Carrie Salazar [mailto:salazar@nature.berkeley.edu]
> >
> >Hi Alex,
> >I did a Google search on the problem and all I could come up
> >with was a conversation in comp.infosystems.www.servers.unix
> >(not .windows, I know) that mentions a similar problem.  The
> >URL is humongous, so the gist of the talk was:
> >"the problem is that the combination of IE 5.5 (W2k) and the
> >Acrobat 4.0 IE plugin causes PDF display problems when keepalive is
> >turned on: pages often turn out blank. Since keepalive is generally
> >good, we only wanted to turn it off for PDFs" and "The reason for 
> >this behaviour is chunked encoding bound to persistent connections. 
> >From this point of view disabling KeepAlive is a workaround some 
> >way".  the "solution" was to add "SetEnvIf Request_URI 
> >"\.pdf$" nokeepalive" 
> >in the apache.conf file.  
> >
> >Does this make sense to anyone?
> 
> If I can try to get my head around this...
> 
> The original HTTP spec was a simple request-response protocol: the
> client made a single request and the server responded with the data,
> then closed the connection.  However, this led to a lot of requests
> when, say, the client gets a page and then finds it has 10 images in
> it - the client then has to make 10 more separate requests, one for
> each image.
> 
> The "KeepAlive" directive allowed apache to hold the connection open after sending the
first resp
onse in case any subsequent responses came in. The "MaxKeepAliveRequests" directive limited
this be
haviour so as not to hang up the server too much. This type of behaviour is default in HTTP/1.1
and
 is called "persistent connection".
> 
> So far, so good...
> 
> When one machine sends data to another over HTTP, the server normally
> tells the client how much data is coming (the Content-Length header)
> so that the client can prepare a buffer for it.  However, sometimes
> (e.g. CGI output) the server doesn't know the amount of data in
> advance.  So HTTP/1.1 allows "chunked encoding" - the server takes
> the data stream from the generating process and, when it has a
> reasonable amount, calls this a "chunk" and sends it to the client.
> Thus the client receives a series of chunks of arbitrary size (not
> all necessarily equal) which it reassembles to produce the final
> file.
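> 
> For illustration, a raw chunked response looks roughly like this
> (chunk sizes are in hex, and a zero-length chunk ends the body):
> 
>     HTTP/1.1 200 OK
>     Transfer-Encoding: chunked
>     Content-Type: application/pdf
> 
>     1000
>     ...4096 bytes of data...
>     1000
>     ...4096 bytes of data...
>     0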

> 
> What the note above is saying is that the two processes don't work
> so well together with certain clients (in this case IE 5.5 + Acrobat
> 4.0).  I'm not sure why the server should send a PDF using chunked
> encoding, since it *does* know the size in advance.  Perhaps chunked
> encoding has become the standard way of transferring large data sets
> simply because it is easier to send lots of small files than one
> large one (after all, that is what TCP/IP really does underneath
> HTTP).
> 
> To understand the problem fully, you'd need to sniff the packets
> going between client and server during the exchange, but for the
> moment it would seem that switching off KeepAlive for PDFs might fix
> it...
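> 
> (If anyone does sniff it: on a unix box, something like
> 
>     tcpdump -s 0 -w pdf.pcap port 80
> 
> would capture the exchange to a file; then check whether the PDF
> responses carry "Transfer-Encoding: chunked".)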
> 
> Rgds,
> 
> Owen Boyle
> 



Technical Manager
Online Learning Support Unit
Middlesex University Business School

a.moon@mdx.ac.uk
020 8411 5092


---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org

