perl-embperl mailing list archives

From Neil Gunton <n...@nilspace.com>
Subject Re: Gerald: please update module to include patch for CGI.pm 3.38+
Date Fri, 18 Sep 2009 16:10:59 GMT
Chris Denman wrote:
> I made a utility for Twitter that went viral for a week - my server
> took 30,000 hits in a few hours.  Of course, it crashed.  I went away
> and tried everything to get it working under this sort of load.  Thing
> is, the traffic slowed and I'll never know if my new setup will hold
> up.  Can I expect, say, 200 concurrent users on an Embperl website with
> only 1Mb of RAM?  If so, what's the best Apache configuration?

Does every single request need to be handled by Embperl, or could you 
perhaps cache some of the pages? I use a two-level caching reverse proxy 
Apache configuration on my server. There are basically two installations 
of Apache: a "front end" with mod_proxy, which serves all the static 
content (mostly images) and passes requests for dynamic content to the 
"backend" Apache (running on the same server), which has mod_perl and 
Embperl. The front end can be very lightweight - since it doesn't carry 
mod_perl etc. it doesn't use much memory, so you can run a whole bunch 
of those processes. If you also enable mod_cache on the front end, it 
can cache the results that come back from the backend Embperl server. 
So if you're careful about setting expiration times for your Embperl 
content, you can substantially lighten the load and still handle a whole 
bunch of concurrent users without breaking a sweat.
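
In case it's useful, here is a rough sketch of what the front-end config 
might look like. The ports, paths and cache directory are made-up 
examples for illustration, not my actual setup (this assumes an Apache 
2.2-style front end with mod_proxy, mod_proxy_http, mod_cache and 
mod_disk_cache loaded):

    # Front-end Apache: lightweight, no mod_perl, listens on port 80.
    # Static files are served straight from its own document root.
    DocumentRoot /var/www/static

    # Reverse proxy only - never act as a forward proxy.
    ProxyRequests Off

    # Hand everything under /app/ to the backend Embperl Apache on port 8080.
    ProxyPass        /app/ http://127.0.0.1:8080/app/
    ProxyPassReverse /app/ http://127.0.0.1:8080/app/

    # Cache backend responses on disk; mod_cache honours whatever
    # Cache-Control/Expires headers the Embperl pages send back.
    CacheEnable disk /app/
    CacheRoot /var/cache/apache2/proxy
    CacheDirLevels 2
    CacheDirLength 1

With something like that, the front end answers anything it already has 
a fresh copy of, and only the cache misses ever touch the heavyweight 
mod_perl processes.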

You usually find that not every single "dynamic" page actually has to be 
generated anew for every single request. For example, on my community 
website, there are pages which are "what's new" indexes of content. 
These are set to have a cache expiry of one minute. So if a whole bunch 
of people hit that page, most of those requests will be served by the 
front end, until that version expires (next minute). So the backend is 
only hit for that page once per minute. It gets a bit more complicated 
if you have users with cookies, but basically I only set 'no-cache' on 
those parts of the site that really do need Embperl on every request 
(e.g. editing forms, user-specific areas etc.).
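
For what it's worth, the expiry can be set from inside the Embperl page 
itself via the %http_headers_out hash - the snippet below is just an 
illustrative sketch, not lifted from my actual pages:

    [-
      # Let the front-end cache (and browsers) keep this index page for
      # one minute; after that the next request regenerates it.
      $http_headers_out{'Cache-Control'} = 'max-age=60';
    -]

    [-
      # For editing forms and other user-specific pages, make sure every
      # request goes through to the backend.
      $http_headers_out{'Cache-Control'} = 'no-cache';
    -]

That way each page decides for itself how long the front end is allowed 
to serve it from cache.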

Caching is how you handle scaling - most websites go down because they 
handle everything with the dynamic engine, and then one day they get hit 
by a bunch of people and "bang". Early on I had a couple of my articles 
go to the front page of Slashdot. As soon as I put in the caching 
architecture, my server was able to handle slashdottings without 
breaking a sweat. I'd get upward of 40,000 hits within a few hours, and 
the server was just basically idling along, since most of the hits were 
effectively being served as static content from cache - even though the 
article was dynamically generated by Embperl.

> Can't wait for any new versions....!

Me too - I'm so glad to see Gerald back on the list. I have based my 
entire website development around Embperl (hundreds of thousands of 
lines of code now, and going on for ten years of development). I was 
afraid that Embperl was abandonware, since I'd seen the Embperl list 
traffic dwindle and questions go unanswered. I really hope to see 
Embperl rekindled. If/when my business takes off (of course it will, any 
day now!) then I hope to be able to send some appreciation Gerald's way.

Neil
http://www.neilgunton.com/

---------------------------------------------------------------------
To unsubscribe, e-mail: embperl-unsubscribe@perl.apache.org
For additional commands, e-mail: embperl-help@perl.apache.org

