httpd-users mailing list archives

From Martijn <>
Subject [users@httpd] setting MaxClients locally?
Date Fri, 08 Jun 2007 09:26:39 GMT

A bit of a long shot... On my website, there is a directory containing
a relatively large number of big files (PDFs). Every now and then,
there is a user that sees them, gets very excited and downloads them
all within a short period of time (probably using FF's DownThemAll
plugin or something similar). Fair enough, that's what they're for,
but, especially if the user is on a slow connection, this will make
them tie up all available child processes, causing the site to be
unreachable by others, which leads to swapping and, eventually, an
unresponsive server.

I'm looking for a quick, on-the-fly way to prevent this from happening
(in the long run, the whole server code will be re-written, so I
should be able to use some module - or write one myself). I googled a
bit about limiting the number of child processes per IP address, but
that seems to be a tricky business. Then I was thinking, is there
perhaps a nice way of setting MaxClients 'locally' to a small number,
so that no more than, say, 10 or 20 child processes will be dealing
with requests from a certain directory, while the other processes will
happily be dealing with the rest? E.g. (non-working example!)
something like

MaxClients 100

<Directory /pdf>
    LocalMaxClients 20
</Directory>
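For what it's worth, the closest existing thing I came across is the
third-party mod_limitipconn module, which caps concurrent requests per
client IP for a location rather than capping child processes for a
directory. Untested, and directive names are as I understand them from
its docs, so treat this as a sketch:

```apache
# mod_limitipconn reads the scoreboard, so extended status must be on
ExtendedStatus On

<Location /pdf>
    # Allow at most 5 simultaneous requests from any single IP
    # to this location; further requests get a 503.
    MaxConnPerIP 5
</Location>
```

That would stop one greedy downloader from monopolising the children,
though it doesn't cap the *total* number of children serving /pdf the
way a hypothetical LocalMaxClients would.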

I know this won't be the nicest solution - it would still prevent
other, non-greedy users from downloading the PDFs while the greedy
person is leeching the site - but something like this would make my
life a lot easier for the time being. Oh, and perhaps I should add
that I don't really care about bandwidth.

Any ideas?


The official User-To-User support forum of the Apache HTTP Server Project.
