httpd-dev mailing list archives

From Rainer Jung <rainer.j...@kippdata.de>
Subject Problem with file descriptor handling in httpd 2.3.1
Date Sat, 03 Jan 2009 23:16:09 GMT
While testing 2.3.1 I noticed a lot of EMFILE errors ("Too many
open files"). I traced the processes with strace, and the problem looks like this:

- The test case uses ab with HTTP keep-alive, concurrency 20, and a
small static file, so it does about 2000 requests per second.
MaxKeepAliveRequests is 100 (the default).

- The file leading to EMFILE is the static content file, which can be
observed to be open more than 1000 times in parallel, although the ab
concurrency is only 20.

- From looking at the code, the file appears to be closed by a cleanup
function registered on the request pool, which is triggered by an EOR bucket
(see the sketch after this list).
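
For illustration, here is a minimal standalone sketch of the APR
pool-cleanup pattern involved (not httpd's actual bucket code; the path
and the trivial main() are just placeholders): apr_file_open() ties the
descriptor's lifetime to the pool it is given, so the file is only
closed when that pool is destroyed.

#include <apr_general.h>
#include <apr_pools.h>
#include <apr_file_io.h>

int main(void)
{
    apr_pool_t *pool;
    apr_file_t *f;

    apr_initialize();
    apr_pool_create(&pool, NULL);

    /* apr_file_open() registers a cleanup on 'pool', so the
     * descriptor stays open until the pool is destroyed. */
    apr_file_open(&f, "/etc/hostname", APR_READ, APR_OS_DEFAULT, pool);

    /* ... serve the content; the fd remains open past this point ... */

    apr_pool_destroy(pool);  /* only here is the file actually closed */
    apr_terminate();
    return 0;
}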

Now, what happens under keep-alive is that the content files are kept open
longer than the handling of the request, more precisely until the
connection closes. So when MaxKeepAliveRequests * Concurrency >
MaxNumberOfFDs, we run out of file descriptors.
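With the figures above that is 100 * 20 = 2000 descriptors potentially
open at once, well beyond the usual Linux per-process default of 1024,
which fits the more than 1000 parallel opens observed.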

I observed the behaviour with 2.3.1 on Linux (SLES10, 64-bit) with Event,
Worker and Prefork. I haven't yet had time to retest with 2.2.

For Event and Worker I also get crashes (more precisely, httpd processes
stopping), because apr_socket_accept() also returns EMFILE.
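
For reference, the EMFILE condition itself is easy to reproduce outside
httpd; this little standalone C program (arbitrary path, nothing
httpd-specific) simply opens files until the per-process limit is hit:

#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <fcntl.h>

int main(void)
{
    int count = 0;

    /* Open the same file over and over until the per-process
     * descriptor limit is exhausted. */
    for (;;) {
        int fd = open("/etc/hostname", O_RDONLY);
        if (fd < 0) {
            /* with the usual ulimit of 1024 this prints
             * "Too many open files" (EMFILE) */
            printf("open() failed after %d descriptors: %s\n",
                   count, strerror(errno));
            break;
        }
        count++;
    }
    return 0;
}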

Regards,

Rainer

