httpd-users mailing list archives

From Laura Vance <>
Subject Re: [users@httpd] limit number of CGI processes
Date Mon, 19 Apr 2004 17:01:00 GMT
Hi Florian,

I ran into this same problem. Any time you're working with an interpreted 
language run as CGI, the system has to allocate a separate interpreter, 
with its own resources, for every instance of the program.  The 
difference is that I was using Perl instead of PHP, but the problem is 
the same.

The way I overcame this problem was to rewrite the software in C++.  That 
way the interpretation overhead dropped to zero.

My scenario was as follows:
- approximately 400 computers, all using the same Perl application at 
the same time
- of those, around 40 or 50 would click a submit button simultaneously 
(or in rapid succession)
- each Perl process took an average of 2 seconds to run; that is just 
for loading the interpreter and running through a very basic program, so 
increase the time if the program has any complexity or database calls
- the server processor and memory usage would spike (the interpreter 
takes a LOT of RAM compared to the code itself)
- if the number of processes got high enough, the server would crash, 
because it ran out of swap space
- as the server slowed while handling the increased number of processes, 
more requests would get backlogged, taking still more memory and 
processor time

As you can see, this is a classic cascading failure: the slowdown caused 
by the original load gives time for even more requests to come in.

After I rewrote the program in C++ and started testing the CPU load, I 
couldn't make enough simultaneous requests to force a noticeable 
difference in CPU load or memory usage.  Then when I put the rewritten 
software online with all 400 users, there was never a noticeable slowdown 
in system performance.

I mainly use Perl for fast program writing / testing / debugging, but 
when it's time for the app to be unleashed, it finds its way into C++ 
soon after implementation.

Florian Effenberger wrote:

>Hi there,
>I run PHP as CGI (because of suEXEC), but some configuration must be wrong.
>I just tried reloading a PHP-generated website about 20 or 30 times in
>my browser, and this really bogged down the server; I had a load average of
>about 20 or 30.
>Is there anything I can do to limit this risk? I already fiddled around with
>some configuration variables, but it didn't help. It always created a whole
>lot of CGI children that used up all the memory...
>I run Apache 2.0 on Linux 2.4.

Laura Vance
Systems Engineer
Winfree Academy Charter Schools, Data-Business Office
1711 W. Irving Blvd. Ste 310
Irving, Tx  75061

The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:> for more info.
To unsubscribe, e-mail:
   "   from the digest:
For additional commands, e-mail:
