httpd-dev mailing list archives

From Brian Pane <brian.p...@cnet.com>
Subject Re: Segmentation fault when downloading large files
Date Sun, 01 Sep 2002 18:33:46 GMT
Graham Leggett wrote:

> Peter Van Biesen wrote:
>
>> I now have a reproducible error, an httpd which I can recompile ( it's
>> still a 2.0.39 ), so if anyone wants me to test something, shoot! Btw,
>> I've seen in the code of ap_proxy_http_request that the variable e is
>> used many times, but I can't seem to find a free anywhere ...
>
>
> This may be part of the problem. In APR, memory is allocated from a
> pool of memory and then freed in one go. In this case, there is
> one pool per request, which is only freed when the request is
> complete. But during the request, 100MB of data is transferred,
> resulting in buckets which are allocated but not freed (yet). The
> machine runs out of memory and that process segfaults.


But the memory involved here ought to be in buckets (which can
be freed long before the entire request is done).
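
A rough standalone sketch of that distinction (not httpd code; the data
and control flow here are invented for illustration, only the apr_pool_*
and apr_bucket_* calls are the stock APR/apr-util API):

#include <apr_general.h>
#include <apr_pools.h>
#include <apr_buckets.h>

int main(void)
{
    apr_pool_t *p;
    apr_initialize();
    apr_pool_create(&p, NULL);

    /* Pool allocation: nothing allocated this way is freed until
     * apr_pool_destroy(), so allocations out of the request pool
     * pile up for the lifetime of the request. */
    char *scratch = apr_palloc(p, 8192);
    (void)scratch;

    /* Bucket allocation: a heap bucket owns its (copied) data, and
     * apr_bucket_delete() releases it immediately -- long before the
     * pool, i.e. the request, goes away. */
    apr_bucket_alloc_t *list = apr_bucket_alloc_create(p);
    apr_bucket_brigade *bb = apr_brigade_create(p, list);
    const char body[] = "one chunk of the 100MB response";
    apr_bucket *b = apr_bucket_heap_create(body, sizeof(body) - 1,
                                           NULL, list);
    APR_BRIGADE_INSERT_TAIL(bb, b);

    /* ... a filter would read/consume the bucket's contents here ... */
    apr_bucket_delete(b);    /* heap memory can be reclaimed now */

    apr_pool_destroy(p);     /* pool memory only goes away here */
    apr_terminate();
    return 0;
}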

In 2.0.39 and 2.0.40, the content-length filter's habit of
buffering the entire response would keep the httpd from freeing
buckets incrementally during the request.  That particular
problem is gone in the latest 2.0.41-dev CVS head.  If the
segfault problem still exists in 2.0.41-dev, we need to take
a look at whether there's any buffering in the proxy code that
can be similarly fixed.
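
For reference, the non-buffering pattern looks roughly like the sketch
below. This is not the real content-length or proxy filter code -- the
filter name is made up, and a real filter does more work per brigade --
but it shows the idea: hand each brigade to the next filter as it
arrives instead of setting the whole response aside until EOS, so the
buckets can be deleted incrementally as data reaches the client.

#include "httpd.h"
#include "util_filter.h"

/* Hypothetical filter illustrating the non-buffering pattern.  A
 * buffering filter would ap_save_brigade() everything and only flush
 * at EOS, which is what keeps every bucket of a 100MB response alive
 * at once.  Passing each brigade straight through lets downstream
 * filters consume and delete the buckets as the data is written out. */
static apr_status_t passthrough_output_filter(ap_filter_t *f,
                                              apr_bucket_brigade *bb)
{
    return ap_pass_brigade(f->next, bb);
}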

Brian



