cayenne-user mailing list archives

From "mr.abanjo" <mr.aba...@gmail.com>
Subject Extract large data from database
Date Wed, 18 Mar 2009 11:14:33 GMT
Hi, I'm developing an application that must load a large amount of data
(100,000 records) and stream it out from a servlet.

I wrote a "for" loop that on each iteration loads 200 records from the
database (using the DataContext.performQuery method) and then flushes the
data to the servlet output stream, roughly like the sketch below.
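
This is more or less what the loop looks like (simplified; the MyRecord
entity, its "id" attribute, the context/out variables, and the writeRow
helper are placeholders, and I page on the primary key):

import java.util.Iterator;
import java.util.List;
import org.apache.cayenne.exp.Expression;
import org.apache.cayenne.exp.ExpressionFactory;
import org.apache.cayenne.query.SelectQuery;

int lastId = 0;
while (true) {
    // fetch the next page of at most 200 records, ordered by PK
    Expression qualifier = ExpressionFactory.greaterExp("id", new Integer(lastId));
    SelectQuery query = new SelectQuery(MyRecord.class, qualifier);
    query.addOrdering("id", true); // ascending, so pages don't overlap
    query.setFetchLimit(200);

    List page = context.performQuery(query);
    if (page.isEmpty()) {
        break; // no more records
    }

    for (Iterator i = page.iterator(); i.hasNext();) {
        MyRecord record = (MyRecord) i.next();
        writeRow(out, record); // flush to the servlet output stream
        lastId = record.getId();
    }
}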
This way I assumed the maximum amount of memory allocated by Cayenne
would be about:

object size * 200

but that doesn't seem to be true. When I request the data from the
servlet, I get an "out of memory" error after a few seconds (the 500 MB
allocated for the heap is exhausted very quickly... :-( ...)

What is the best practice in this case?
I could increase the heap, but I'd rather find an approach that keeps the
heap footprint small during this operation.

I found this article:
http://cayenne.apache.org/doc20/iterating-through-data-rows.html
Is this the right way?
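
If it is, I guess my code would end up looking something like this
(a sketch adapted from the pattern on that doc page; MyRecord and
writeRow are placeholders again):

import java.util.Map;
import org.apache.cayenne.CayenneException;
import org.apache.cayenne.access.ResultIterator;
import org.apache.cayenne.query.SelectQuery;

ResultIterator it = null;
try {
    SelectQuery query = new SelectQuery(MyRecord.class);
    // rows come back one at a time instead of as one big in-memory list
    it = context.performIteratedQuery(query);
    while (it.hasNextRow()) {
        Map row = it.nextDataRow();
        writeRow(out, row); // stream each row out immediately
    }
}
catch (CayenneException e) {
    // log and abort the response
}
finally {
    if (it != null) {
        try {
            it.close(); // must always close, or a pool connection stays locked
        }
        catch (CayenneException e) {
        }
    }
}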

The servlet I'm implementing will be called by a different webapp (a
backoffice, in a server-to-server way) with low traffic, but I'm worried
about the warning reported in the article:

"In web applications, programmers must ensure that no open ResultIterators
are kept between HTTP requests. Failure to do so may result in too many
database connections being locked, thus quickly exhausting connection pool.
In general, an application with Web GUI is NOT a good candidate for
implementation using ResultIterators."

Thanks
Davide
