openjpa-users mailing list archives

From Kevin Sutter <>
Subject Re: OpenJPA Fetch plan setFetchBatchSize gives out of memory error when used with Oracle 9i
Date Mon, 16 Nov 2009 15:45:39 GMT
Is this a trick question?  :-)  By setting the FetchBatchSize to MAX_VALUE,
you are asking OpenJPA to traverse all relationships and instantiate
all entities that are touched.  If your object graph is involved, this could
turn out to be a rather large result set and, depending on the memory
available, you could easily exhaust it and get an OOM exception.  Your
experimentation has already shown that specifying a concrete value works
just fine.  So, I'm just confused as to what the real question is...
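For reference, a bounded batch size can be set on a query's fetch plan through OpenJPA's extended Query interface. A minimal sketch follows; the entity name `ProcessInstance` and the surrounding EntityManager setup are assumptions for illustration, not part of this thread:

```java
import java.util.List;

import javax.persistence.EntityManager;
import javax.persistence.Query;

import org.apache.openjpa.persistence.OpenJPAPersistence;
import org.apache.openjpa.persistence.OpenJPAQuery;

public class BatchSizeExample {

    // Run a query with a bounded fetch batch size instead of
    // Integer.MAX_VALUE, so rows are pulled from the JDBC driver
    // in chunks rather than all buffered at once.
    public static List<?> fetchInstances(EntityManager em) {
        Query q = em.createQuery("select p from ProcessInstance p");

        // Cast the plain JPA Query to OpenJPA's extended interface
        // to reach the FetchPlan.
        OpenJPAQuery kq = OpenJPAPersistence.cast(q);

        // 100 rows per fetch; 0 falls back to the JDBC driver's default.
        kq.getFetchPlan().setFetchBatchSize(100);

        return kq.getResultList();
    }
}
```

This fragment needs a configured persistence unit and a live EntityManager to run, so it is shown here only as an API sketch.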

More information on FetchBatchSize can be found here:


On Mon, Nov 16, 2009 at 5:09 AM, dileep55 <> wrote:

> Hi,
> Environment:
> Apache ODE : 1.3.4
> Open JPA : 1.3.0 snapshot
> DB : oracle 9i
> In BpelDAOConnectionImpl of Apache ODE, instanceQuery(instanceFilter)
> throws an out-of-memory exception when setFetchBatchSize is called with
> Integer.MAX_VALUE.
> However, when the batch size is set to a value such as 0 or 100, the
> results are fetched correctly without any issues.
> Does anyone have any idea about this kind of behaviour?
> Thanks
> Dileep
