httpd-users mailing list archives

From Nick Kew <>
Subject Re: [users@httpd] dealing with CPU hogs
Date Fri, 06 Jul 2007 22:12:45 GMT
On Fri, 6 Jul 2007 16:30:32 -0400
"Tony Rice \(trice\)" <> wrote:

> Any suggestions on configuration changes I can make to lessen the
> impact of CGI scripts which become CPU hogs? 
> I'm running an apache server with about 150 virtual servers.
> Occasionally an errant script will go nuts and consume 100% of the
> CPU.  It's really bad when a spider finds one of these poorly written
> scripts and calls it a few thousand times.

Two suggestions:

(1) Use ulimit to limit the resources (CPU time, memory, processes)
a CGI script can consume.
(2) Use mod_load_average (google for where it is) to refuse to
run the bad scripts, or refuse to run any script, when the server
is busy (as measured by system load average).
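For suggestion (1), ulimit-style caps can also be set directly in
httpd.conf with the core RLimit* directives, which apply to processes
forked by httpd (CGI scripts included).  A minimal sketch; the numbers
are illustrative, not recommendations:

```apache
# Cap CPU time for forked children: soft limit 30s, hard limit 60s.
# A runaway script is killed by the OS once it exceeds the hard limit.
RLimitCPU 30 60

# Optionally cap memory (bytes) and the number of processes as well.
RLimitMEM   50000000 100000000
RLimitNPROC 25 50
```

These can go in the main server config or inside a <VirtualHost>, so
different limits per virtual server are possible.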

If you can identify bad spiders reliably, e.g. by IP address or
User-Agent, you can block specifically that.  There's a mod_rewrite
recipe for that, or there's mod_robots, which I hacked up quickly
to deal with a particular robot that was causing a similar problem.
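The mod_rewrite recipe is roughly the following; "BadBot" and the IP
address are placeholders to substitute with whatever identifies the
offending spider in your logs:

```apache
RewriteEngine On
# Refuse (403) any request whose User-Agent matches the bad robot...
RewriteCond %{HTTP_USER_AGENT} BadBot [NC,OR]
# ...or which comes from its known IP address.
RewriteCond %{REMOTE_ADDR} ^192\.0\.2\.1$
RewriteRule . - [F]
```

The [F] flag returns Forbidden without running the script, so the
request costs next to nothing.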

> Any other suggestions?  Can consistent
> connections be throttled on a per client IP address basis?   Can
> server processes be controlled on a per virtual server basis?

Take a look at for bandwidth-limiting modules.
But they really solve a different problem.

Nick Kew

Application Development with Apache - the Apache Modules Book

The official User-To-User support forum of the Apache HTTP Server Project.
