httpd-dev mailing list archives

From Graham Leggett <minf...@sharp.fm>
Subject Re: Segmentation fault when downloading large files
Date Mon, 02 Sep 2002 06:22:46 GMT
Brian Pane wrote:

> But the memory involved here ought to be in buckets (which can
> be freed long before the entire request is done).
> 
> In 2.0.39 and 2.0.40, the content-length filter's habit of
> buffering the entire response would keep the httpd from freeing
> buckets incrementally during the request.  That particular
> problem is gone in the latest 2.0.41-dev CVS head.  If the
> segfault problem still exists in 2.0.41-dev, we need to take
> a look at whether there's any buffering in the proxy code that
> can be similarly fixed.

The proxy code doesn't buffer anything; it basically goes "get a bucket 
from the backend stack, pass the bucket to the frontend stack, clean up 
the bucket, repeat".
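
A minimal sketch of that loop in 2.0 terms (not the actual mod_proxy 
source; the function name and the 8K read size are made up, and error 
paths are trimmed for brevity):

#include "httpd.h"
#include "util_filter.h"
#include "apr_buckets.h"

static apr_status_t stream_backend_response(request_rec *r,
                                            ap_filter_t *backend_input)
{
    apr_bucket_brigade *bb = apr_brigade_create(r->pool,
                                        r->connection->bucket_alloc);
    apr_status_t rv;
    int seen_eos = 0;

    while (!seen_eos) {
        /* get a brigade of buckets from the backend stack */
        rv = ap_get_brigade(backend_input, bb, AP_MODE_READBYTES,
                            APR_BLOCK_READ, 8192);
        if (rv != APR_SUCCESS) {
            return rv;
        }
        if (!APR_BRIGADE_EMPTY(bb)
            && APR_BUCKET_IS_EOS(APR_BRIGADE_LAST(bb))) {
            seen_eos = 1;
        }
        /* put the buckets to the frontend stack */
        rv = ap_pass_brigade(r->output_filters, bb);
        if (rv != APR_SUCCESS) {
            return rv;
        }
        /* cleanup: destroying the buckets here is what keeps memory
         * bounded for the whole response, however large the file */
        apr_brigade_cleanup(bb);
    }
    return APR_SUCCESS;
}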

There are some filters (mod_include, I think) that "put away" buckets 
as the response is handled; it is possible that one of these filters is 
also causing a "leak".
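
For comparison, a filter that "puts away" buckets would look something 
like this hypothetical sketch (the filter and context names are 
invented; ap_save_brigade is the real 2.0 call filters use to set data 
aside). Nothing is passed downstream until EOS, so the saved brigade 
grows with the whole response:

#include "httpd.h"
#include "util_filter.h"
#include "apr_buckets.h"

typedef struct {
    apr_bucket_brigade *saved;    /* grows on every invocation */
} hold_ctx;                       /* hypothetical context struct */

static apr_status_t hold_output_filter(ap_filter_t *f,
                                       apr_bucket_brigade *bb)
{
    hold_ctx *ctx = f->ctx;

    if (!ctx) {
        ctx = f->ctx = apr_pcalloc(f->r->pool, sizeof(*ctx));
    }
    if (!APR_BRIGADE_EMPTY(bb)
        && APR_BUCKET_IS_EOS(APR_BRIGADE_LAST(bb))) {
        /* Only at end-of-stream does anything reach the client; for
         * a large response this behaves like a per-request leak. */
        if (ctx->saved) {
            APR_BRIGADE_CONCAT(ctx->saved, bb);
            return ap_pass_brigade(f->next, ctx->saved);
        }
        return ap_pass_brigade(f->next, bb);
    }
    /* "put away" the buckets into a brigade that outlives this call */
    return ap_save_brigade(f, &ctx->saved, &bb, f->r->pool);
}

If a filter in the chain holds buckets like this without ever flushing 
incrementally, memory climbs with the response size just as the old 
content-length filter buffering did.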

Regards,
Graham
-- 
-----------------------------------------
minfrin@sharp.fm 
	"There's a moon
					over Bourbon Street
						tonight..."

