perl-modperl mailing list archives

From "Perrin Harkins" <>
Subject Re: few newbie quesitons..
Date Mon, 07 May 2007 04:17:03 GMT
On 5/7/07, James. L <> wrote:
> the files i need to parse are usually in size of 2M -
> 10M. will the mod_perl server(2G Mem) use up the
> memory pretty quick after few hundred requests on
> different files?

You're misunderstanding a little bit.  It's not that the memory used
in parsing a file gets lost permanently.  Instead, the variable you
loaded the data into holds onto the memory from the largest size it
reached, and Perl reuses that memory on later requests.

> sub parse {
>   my ($class,$file) = @_;
>   my @data;
>   open my $F, '<', $file or die "Can't open $file: $!";
>   while ( my $line = <$F> ) {
>     chomp $line;
>     my @fields = split /=/, $line;
>     push @data, \@fields;
>   }
>   close $F;
>   return \@data;
> }

If you read enough data into @data to use up 20MB, the process will
stay that size.  That's a good thing if you intend to read another
file of similar size on the next request.  It's only bad if you read
a very large amount of data in only occasionally, because the memory
stays allocated between those rare requests.

The best way to avoid this kind of problem is to not read the whole
thing into RAM.  You can pass an iterator object to TT instead of
loading all the data at once.
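Here's a minimal sketch of what that could look like, assuming
Template Toolkit's iterator protocol (FOREACH calls get_first once,
then get_next until it returns STATUS_DONE, and honors objects that
subclass Template::Iterator).  The class name FileIterator is made up
for illustration:

```perl
package FileIterator;
use strict;
use warnings;
use base 'Template::Iterator';
use Template::Constants qw(STATUS_DONE);

sub new {
    my ($class, $file) = @_;
    open my $fh, '<', $file or die "Can't open $file: $!";
    return bless { fh => $fh }, $class;
}

# TT calls get_first for the first row, then get_next repeatedly.
sub get_first { my $self = shift; return $self->get_next; }

sub get_next {
    my $self = shift;
    my $line = readline($self->{fh});
    # End of file: tell TT's FOREACH loop to stop.
    return (undef, STATUS_DONE) unless defined $line;
    chomp $line;
    # Parse and return one row at a time -- the whole file is
    # never held in memory at once.
    return [ split /=/, $line ];
}

1;
```

Then the handler can do something like
$template->process('report.tt', { rows => FileIterator->new($file) })
and iterate with [% FOREACH row IN rows %] in the template; memory use
stays roughly constant regardless of the file size.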

- Perrin
