openjpa-users mailing list archives

From "Patrick Linskey" <>
Subject Re: flush(): memory & performance
Date Fri, 14 Sep 2007 01:23:28 GMT

What happens if you call evict() on the objects that should be made hollow?
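The pattern Patrick is suggesting can be sketched as: flush a batch of processed nodes, then evict them so their cached state can be garbage collected while the transaction stays open. Below is a minimal standalone sketch of that control flow; the `Pm` class is a stand-in I made up so the loop can run without a database, and in real code `makePersistent`, `flush`, and `evict` would be the corresponding `javax.jdo.PersistenceManager` calls. The batch size of 1,000 is an arbitrary tuning guess.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for a JDO PersistenceManager so the control flow runs standalone.
// In real code these would be javax.jdo.PersistenceManager calls.
class Pm {
    final List<Object> dirty = new ArrayList<>();
    int flushes, evictions;

    void makePersistent(Object o) { dirty.add(o); }
    void flush()                  { flushes++; }   // writes SQL, fields stay loaded
    void evict(Object o)          {                // drops the cached field state
        dirty.remove(o);
        evictions++;
    }
}

public class StreamImport {
    public static void main(String[] args) {
        Pm pm = new Pm();
        int batch = 0;
        for (int i = 0; i < 10_000; i++) {
            pm.makePersistent(new Object());       // one tree node from the stream
            if (++batch == 1_000) {
                pm.flush();                        // push writes, stay in one tx
                for (Object o : new ArrayList<>(pm.dirty)) {
                    pm.evict(o);                   // hollow out processed nodes
                }
                batch = 0;
            }
        }
        System.out.println(pm.flushes + " flushes, " + pm.evictions + " evictions");
    }
}
```

With the simulated batching, all 10,000 nodes end up evicted across 10 flushes, so only the current batch would be held in memory at any time.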


On 9/13/07, Christiaan <> wrote:
> Hi,
> I am working on the following scenario: I need to import a large file and
> create a tree-like object structure. This process is pretty much
> stream-based, so once I have created parts of the tree I don't need to
> access them anymore. However, if anything fails, the entire creation needs
> to be rolled back.
> I am trying to do this by using flush(). As mentioned in another thread,
> this results in a big performance hit (I created an issue for this).
> However, I also noticed that the fields of dirty objects are not cleared on
> flush(), so they cannot be garbage collected (I have set retainValues to
> false). Since it is a tree-like structure, I also keep a reference to the
> root node during the whole process. In such a scenario this easily leads to
> OutOfMemory errors. When I do intermediate commits, I do get constant
> performance and memory usage, because with retainValues=false the
> already-processed nodes transition to hollow. I was under the impression
> that flush() is the method typically used in this scenario, or am I
> mistaken? Would it be possible, when retainValues is set to false, to
> transition dirty objects to hollow when flush() is called, just as commit()
> does? I know the JDO/JPA specs say nothing about this situation for
> flush(), but it seems like a logical consequence to me. For this scenario
> it would solve the memory problem.
> Kind regards,
> Christiaan
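For context, the retainValues behavior Christiaan describes is controlled by a standard JDO option, usually set in the PersistenceManagerFactory properties. A fragment of what that configuration typically looks like (property keys from the JDO spec; values shown match the scenario above):

```properties
# With RetainValues=false, committed instances transition to hollow,
# releasing their field state for garbage collection.
javax.jdo.option.RetainValues=false
```

This is what makes intermediate commits cheap on memory in his test, at the cost of splitting the import across multiple transactions and losing single-transaction rollback.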

Patrick Linskey
202 669 5907
