httpd-users mailing list archives

From: Eli White <>
Subject: Handling large file (image/video) requests.
Date: Tue, 04 Jun 2002 18:26:25 GMT
I am the webmaster for a popular web site that often takes MASSIVE 
request loads, and usually when that happens, many of the requests are for 
LARGE files (10-70Mb)  (Hubble Space Telescope pictures/movies)

The problem, which many of you have probably run into, is that each 
connection stays locked to its user until the download is complete.  So 
with my server throttled at 1024 max connections, as soon as 256 people on 
modems each request 4 of the 70Mb files, the server is suddenly at max 
capacity for half-a-day, slowly feeding each long download and waiting for 
it to finish.

We can't just keep throttling the server higher, because we start running 
out of RAM, with each Apache process eating into it.  [Yes, I know that 
Apache 2 might solve some of these problems with its threaded nature, but 
we use PHP, and so can't go to Apache 2 yet.]
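
For a rough sense of the scale (a back-of-the-envelope sketch; the 
per-process figure is an assumption, not a measurement from our box): an 
Apache child with mod_php loaded can easily sit around 10Mb resident, so

    1024 processes x ~10Mb each = ~10Gb of RAM

and every further bump to the throttle just makes that worse.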

The main solution that seems to present itself for these situations is to 
have a separate image server [either a different machine, or even just 
another copy of Apache running on a different IP on the same machine; it 
could even be Apache 2], with that copy compiled down to the bare minimum 
so it can be throttled VERY high without eating RAM like there is no 
tomorrow.
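
To make that concrete, the following is the sort of config I have in mind 
for the stripped-down second instance (a sketch only; the address, paths, 
and numbers are placeholders, and on Apache 1.3 a MaxClients this high 
also means raising HARD_SERVER_LIMIT at compile time):

    # Second Apache instance: static files only, no mod_php loaded.
    # The address, paths, and limits below are illustrative placeholders.
    Listen 10.0.0.2:80
    ServerName images.example.org
    DocumentRoot /data/hubble

    # With small static-only children, the RAM math allows a much
    # higher throttle than on the main (PHP-enabled) server.
    MaxClients 2048

    # A modem user fetching a 70Mb file holds a slot for the whole
    # transfer anyway; turning KeepAlive off at least frees the slot
    # the moment the download finishes.
    KeepAlive Off
    Timeout 300

    StartServers 32
    MinSpareServers 16
    MaxSpareServers 64
    MaxRequestsPerChild 10000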

(Ok, another solution would be a server farm, but we are leaning away from 
that right now.)

My main question is:  Is there something that I am missing?  Is there some 
other way, besides relegating image requests to a different server to 
'bound' them, to keep connections open for the 'rest' of the website, and 
not let just a few large files fill all the connections?


Space Telescope Science Institute
