perl-modperl mailing list archives

From Boysenberry Payne <boysenbe...@humaniteque.com>
Subject Re: After retrieving data from DB, the memory doesn't seem to be freed up
Date Mon, 14 May 2007 19:48:28 GMT
On May 11, 2007, at 12:05 PM, Michael Peters wrote:

>
> This kind of thing happens all the time. Think of SAX XML parsers or
> mod_perl filters. It's not terribly difficult to parse something in
> chunks like that.
>
> I wasn't saying that it wouldn't be easier to have everything in
> memory. Heck, I'd love it if I never had to read a file in line by
> line anymore and could just slurp them all into arrays. But knowing
> when to put all of something into memory and when not to is part of
> being a programmer.

Okay, quick question related to this: how do I send the appropriate
Content-Length header in a situation where I don't read all of the data
into memory?
Currently, I do something like this:

my $output = "Some string";
{
	use Apache2::Response();  # provides $r->set_content_length
	use bytes;                # make length() count bytes, not characters
	$r->set_content_length(length($output));
}

$r->rflush();
$r->print($output);
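For contrast, here is a rough sketch of the streaming style I mean, where no Content-Length is set at all and rows go out as they are fetched (the handler shape is standard mod_perl 2; prepare_query is a hypothetical helper standing in for however the DBI statement handle gets built and executed):

```perl
use strict;
use warnings;
use Apache2::RequestRec ();
use Apache2::RequestIO ();
use Apache2::Const -compile => qw(OK);

sub handler {
    my $r = shift;
    $r->content_type('text/plain');

    # Hypothetical: returns an already-executed DBI statement handle.
    my $sth = prepare_query($r);

    # Print each row as it arrives; the full result set is never held
    # in memory, and no Content-Length header is ever set.
    while (my @row = $sth->fetchrow_array) {
        $r->print(join("\t", @row), "\n");
    }
    return Apache2::Const::OK;
}
```

My understanding is that when no Content-Length is set, Apache 2 falls back to Transfer-Encoding: chunked for HTTP/1.1 clients (or closes the connection to delimit the body for HTTP/1.0), so the application never has to know the total size up front.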

If I'm chunking, would I re-send the Content-Length header for each
chunk? I'm assuming the bucket brigade has something to do with this
being received by the browser okay...
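From what I can tell, with chunked transfer coding the framing is handled by the server per chunk (a hex size line before each chunk), not by repeated Content-Length headers, which are omitted entirely. A response carrying the "Some string" body above would look roughly like this on the wire (11 bytes = 0xb; CRLFs shown explicitly):

```
HTTP/1.1 200 OK
Transfer-Encoding: chunked

b\r\n
Some string\r\n
0\r\n
\r\n
```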

Boysenberry Payne
boysenberry@habitatlife.com


