httpd-dev mailing list archives

From "William A. Rowe, Jr." <wr...@rowe-clan.net>
Subject Re: Segmentation fault when downloading large files
Date Wed, 28 Aug 2002 12:34:48 GMT
At 07:06 AM 8/28/2002, Graham Leggett wrote:
>Peter Van Biesen wrote:
>
>>>>Program received signal SIGSEGV, Segmentation fault.
>>>>0xc1bfb06c in apr_bucket_alloc () from /opt/httpd/lib/libaprutil.sl.0
>>>>(gdb) where
>>>>#0  0xc1bfb06c in apr_bucket_alloc () from /opt/httpd/lib/libaprutil.sl.0
>>>>#1  0xc1bf8d18 in socket_bucket_read () from /opt/httpd/lib/libaprutil.sl.0
>>>>#2  0x00129ffc in core_input_filter ()
>>>>#3  0x0011a630 in ap_get_brigade ()
>>>>#4  0x000bb26c in ap_http_filter ()
>>>>#5  0x0011a630 in ap_get_brigade ()
>>>>#6  0x0012999c in net_time_filter ()
>>>>#7  0x0011a630 in ap_get_brigade ()
>
>The ap_get_brigade() is followed by an ap_pass_brigade(), then an
>apr_brigade_cleanup(bb).
>
>What could be happening is that either:
>
>a) brigade cleanup is hosed or leaks
>b) one of the filters is leaking along the way

Or it simply tries to slurp all the hundreds of MB of this huge download into memory at once.

As I guessed, we are out of memory.
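
For reference, a minimal sketch of the get_brigade / pass_brigade /
brigade_cleanup loop Graham describes, with readbytes bounded so the body
streams through in small chunks instead of being slurped whole. The
stream_body() wrapper and the MAX_READ constant are illustrative names,
not code from the tree:

#include "httpd.h"
#include "util_filter.h"
#include "apr_buckets.h"

#define MAX_READ 8192  /* read at most 8KB per pass, never the whole body */

static apr_status_t stream_body(request_rec *r)
{
    apr_bucket_brigade *bb;
    apr_status_t rv;
    int seen_eos = 0;

    bb = apr_brigade_create(r->pool, r->connection->bucket_alloc);

    while (!seen_eos) {
        /* Pull at most MAX_READ bytes from the input filter stack. */
        rv = ap_get_brigade(r->input_filters, bb, AP_MODE_READBYTES,
                            APR_BLOCK_READ, MAX_READ);
        if (rv != APR_SUCCESS) {
            return rv;
        }

        /* Note EOS before the brigade is handed off downstream. */
        {
            apr_bucket *e;
            for (e = APR_BRIGADE_FIRST(bb);
                 e != APR_BRIGADE_SENTINEL(bb);
                 e = APR_BUCKET_NEXT(e)) {
                if (APR_BUCKET_IS_EOS(e)) {
                    seen_eos = 1;
                    break;
                }
            }
        }

        /* Pass the chunk down the output filter chain... */
        rv = ap_pass_brigade(r->output_filters, bb);
        if (rv != APR_SUCCESS) {
            return rv;
        }

        /* ...and release any leftover buckets so they don't pile up
         * for the lifetime of a multi-hundred-MB download. */
        apr_brigade_cleanup(bb);
    }
    return APR_SUCCESS;
}

The point is the apr_brigade_cleanup() at the bottom of every pass: if a
filter holds onto buckets instead, a download this size exhausts memory
exactly as in the trace above.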

Someone asked why I asserted that input filtering still sucks.  Heh.

Bill

