httpd-users mailing list archives

From Daniel Lopez <dan...@rawbyte.com>
Subject Re: Handling large file (image/video) requests.
Date Tue, 04 Jun 2002 18:31:57 GMT

Yep, that is pretty much it. The problem is similar to the one faced by servers
running mod_perl, where a reverse proxy is placed in front to deal with slow
clients without tying up the expensive mod_perl processes.
In your case the way to go, as you suggest, is a separate website with a
stripped-down version of Apache that just serves static content. You may
even want to consider other lightweight webservers such as thttpd.
They may not have all the bells and whistles of Apache, but they scale well
for the kind of situation you describe.
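
For the front-end/back-end split, a minimal sketch of the proxy side might look
like this (assuming mod_proxy is compiled in; the backend address and the /perl/
URL prefix are just placeholders):

    # The lightweight front end answers on port 80 and forwards dynamic
    # requests to the heavyweight mod_perl backend on 127.0.0.1:8080,
    # so the cheap front-end processes absorb the slow clients.
    ProxyRequests Off
    ProxyPass        /perl/ http://127.0.0.1:8080/perl/
    ProxyPassReverse /perl/ http://127.0.0.1:8080/perl/

Everything outside /perl/ (images and other static files) would then be served
directly by the front end.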


On Tue, Jun 04, 2002 at 02:26:25PM -0400, Eli White wrote:
> I am the webmaster for a popular website that can often take a MASSIVE 
> number of requests, and usually when that happens, many are for LARGE files 
> (10-70 MB)  (Hubble Space Telescope pictures/movies).
> 
> The problem, which many of you have probably run into, is that each 
> connection must remain tied to its user until the download is 
> complete.  So with my server throttled at 1024 max connections, as soon as 
> 256 people on modems each request four 70 MB files, the server is 
> at max capacity for half a day, because it is slowly feeding those 
> long downloads and waiting for them to finish.
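
(As a rough back-of-envelope, assuming ~5 KB/s of modem throughput per user:
256 users x 4 files = 1024 connections, exactly the configured maximum, and
4 x 70 MB is roughly 280,000 KB shared over that 5 KB/s link, i.e. about
56,000 seconds, well over half a day per user.)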
> 
> We can't keep just throttling the server higher, because we start running 
> out of RAM, with each Apache process eating into it.  [Yes, I know that 
> Apache 2 might solve some of these problems with its threaded nature, but 
> we use PHP, and so can't go to Apache 2 yet.]
> 
> The main solution that seems to present itself for these situations is 
> to have a separate image server [either a different machine, or even just 
> another copy of Apache running on a different IP on the same 
> machine; it could even be Apache 2], and to have that copy of Apache 
> compiled down to the bare minimum so it can be throttled VERY high without 
> eating RAM like there is no tomorrow.
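
For illustration, a stripped-down, static-only instance on a second IP might
boil down to something like the following sketch (the address, hostname, paths,
and limits are placeholders, and on Apache 1.3 MaxClients can only be raised as
far as the compile-time HARD_SERVER_LIMIT):

    # httpd.conf for the static-only Apache: no PHP or mod_perl loaded.
    Listen 10.0.0.2:80
    ServerName images.example.org
    DocumentRoot /var/www/images
    # Allow far more simultaneous connections than the main server...
    MaxClients 2048
    # ...and free each process as soon as its response completes,
    # instead of holding it on an idle keepalive connection.
    KeepAlive Off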
> 
> (OK, another solution would be a server farm, but we are leaning away from 
> that right now.)
> 
> My main question is:  Is there something that I am missing?  Is there some 
> other way, besides relegating image requests to a different server to 'bound' 
> them, to keep connections open for the 'rest' of the website and not 
> let just a few large files fill all the connections?
> 
> Thanks,
> Eli
> 
> Space Telescope Science Institute
> http://hubblesite.org/

---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org

