perl-modperl mailing list archives

From Adam Prime <adam.pr...@utoronto.ca>
Subject Re: huge httpd processes
Date Wed, 30 Sep 2009 14:04:58 GMT
Justin Wyllie wrote:
> Hi Clint,
> 
> Yes, Linux, and this script looks good. We think that part of the problem
> is in the modules Apache is loading, so this will be useful.
> 
> I also have another couple of questions:
> 
> I have found the errant code where our process jumps by 13 MB. One part
> does something like this:
> 
> $file_handle->read($s, $length); # $s is about 0.5 MB
> @data = unpack($format, $s);
> ## at this point memory usage jumps by 8 MB (measured using GTop->size())
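
For reference, a self-contained sketch of that measurement, assuming the
GTop module (the libgtop bindings) is installed; the file name, $length,
and $format below are hypothetical placeholders:

use strict;
use warnings;
use GTop ();
use IO::File ();

my $gtop = GTop->new;

my $fh = IO::File->new('data.bin', 'r') or die "open: $!";
binmode $fh;

my $length = 512 * 1024;   # ~0.5 MB, as in the post above
my $format = 'N*';         # hypothetical unpack() template

my $before = $gtop->proc_mem($$)->size;

my $s;
$fh->read($s, $length);            # slurp the chunk into $s
my @data = unpack($format, $s);    # expand it into a Perl list

my $after = $gtop->proc_mem($$)->size;
printf "process grew by %.1f MB\n", ($after - $before) / (1024 * 1024);

The jump comes from @data: unpack() expands the packed bytes into full
Perl scalars, each with its own per-value overhead, so the resulting list
can cost many times the size of the raw buffer.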

As Clinton said, Perl doesn't free the memory back to the OS when you 
slurp this file into RAM.  The memory does get reused by subsequent 
requests handled by that child; it just isn't returned to the OS.  If 
you really want to release it, you can use $r->child_terminate to make 
that child exit after handling the request, which frees the memory (and 
likely spawns another child in its place).
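
A minimal sketch of that approach, assuming mod_perl 2, where
child_terminate() is provided by Apache2::RequestUtil (under mod_perl 1
it is a method on the Apache request object directly); the handler name
is hypothetical:

package My::BigReadHandler;

use strict;
use warnings;
use Apache2::RequestRec ();
use Apache2::RequestUtil ();            # provides $r->child_terminate
use Apache2::Const -compile => qw(OK);

sub handler {
    my $r = shift;

    # ... the memory-hungry read/unpack work goes here ...

    # Ask the parent httpd to retire this child once the current
    # request has finished; a fresh child is spawned in its place.
    # Note: child_terminate() is not available under threaded MPMs.
    $r->child_terminate;

    return Apache2::Const::OK;
}

1;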

Adam
