httpd-users mailing list archives

From "Juan E Suris" <>
Subject [users@httpd] httpd process using too much memory
Date Sat, 06 Mar 2004 23:10:54 GMT
Hi All!

I have searched through the documentation and can't really find a solution to my problem and
I can't seem to find anybody that knows what to do.

My problem is that I have a Perl CGI (on Linux, Apache 2.0) that generates very large files,
which are sent to the browser directly from the Perl script (using "Content-Disposition: attachment",
etc.). This in turn causes the httpd process serving the request to use a lot of memory
(about the size of the file being served). Is there any way to limit the amount of memory
the process will use? It seems that Apache does not block the script's output so that it
keeps pace with the speed at which the client can receive it, and therefore has to buffer
the output temporarily in memory (correct?).
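In case it helps to see what the script does, here is a minimal sketch of the kind of CGI in question, printing the response in fixed-size chunks rather than building it in one scalar (the filename and chunk size are hypothetical; this alone may not help if Apache itself buffers the output, which is exactly my question):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical source file; in practice the data is generated by the script.
my $file = '/tmp/bigfile.dat';

$| = 1;    # disable Perl's own output buffering

print "Content-Type: application/octet-stream\r\n";
print "Content-Disposition: attachment; filename=\"bigfile.dat\"\r\n\r\n";

open my $fh, '<', $file or die "open: $!";
binmode $fh;
binmode STDOUT;
while (read($fh, my $buf, 64 * 1024)) {    # 64 KB at a time
    print $buf;                            # hand each chunk to httpd
}
close $fh;
```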

A solution that has been proposed is to write the file to the filesystem and redirect the
request to that file. I am really trying to avoid doing this, because the I/O generated by
writing the file to disk and then reading it back will kill my app.
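For reference, the proposed workaround would look something like the following sketch (all paths and names are hypothetical, and generate_chunks() stands in for whatever produces the data) — this is the extra disk round-trip I am trying to avoid:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch of the proposed workaround: stream the output to a
# file under the document root, then redirect the client to it.
my $docroot = '/var/www/html';
my $name    = "report-$$.dat";            # per-request name (hypothetical)

open my $out, '>', "$docroot/tmp/$name" or die "open: $!";
binmode $out;
print $out $_ for generate_chunks();      # stand-in for the script's output
close $out;

# 302 redirect so that Apache serves the static file itself
print "Status: 302 Found\r\n";
print "Location: /tmp/$name\r\n\r\n";

sub generate_chunks { return ("example data\n") }    # hypothetical
```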

Thanks in advance for any help.