db-derby-user mailing list archives

From Øystein Grøvlen <Oystein.Grov...@Sun.COM>
Subject Re: Iterating through large result set in network mode causes OutOfMemoryException
Date Tue, 12 Feb 2008 14:36:42 GMT
Strange.  Seems like a bug.  If you could file a bug report with code to
reproduce it, that would be great.

Regarding your example code: I do not see anything that actually fetches
the content of the Blob, but I guess that is just an omission in the email.
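
(For illustration, a minimal sketch of actually fetching the content,
assuming rs is the ResultSet from your loop; the column name "content"
is taken from your example, the rest is just an assumption:)

    Blob blob = rs.getBlob("content");
    // Blob positions are 1-based; this pulls the whole value into memory
    byte[] bytes = blob.getBytes(1, (int) blob.length());
    String content = new String(bytes);
    blob.free();   // lets the driver release the LOB once we are done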

I think in order for memory to be released when calling free(),
DERBY-3354 needs to be fixed.  I have just uploaded a patch for
DERBY-3354 for people to try out.

With respect to your original repro, my patch for DERBY-2892 should
fix that problem.  (The DERBY-2892 patch has not been applied to the
development branch yet since I think more test cases need to be
written.)  However, I needed to modify your earlier repro slightly to
get it to work: it seems my DERBY-2892 patch only helps if you
actually read the Blobs.  Thank you for pinpointing that.  After I
added a call to ResultSet#getBytes in the loop, the repro no longer
failed when run with the two patches mentioned above.
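
(Just a sketch of what that loop looks like; the column name "content"
is borrowed from your code:)

    while (rs.next()) {
        // the DERBY-2892 patch only seems to help when the Blob value
        // is actually read, hence the explicit getBytes call
        byte[] bytes = rs.getBytes("content");
    }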

In other words, I think we should be able to solve your problems in
the next Derby release.  In the meantime, you may consider building
your own version which includes the above-mentioned patches.

HTH,

--
Øystein


Briggs wrote:
 > Yes, I meant the server.  Sorry.  But, it's my application that causes
 > it (the server) to fail.
 >
 >> The app?  Your first posting indicated that the server ran out of memory.
 >
 >>> Also, the occasional commit is also causing a problem.  I end up
 >>> eventually getting this:
 >>>
 >>> java.sql.SQLException: The data in this BLOB or CLOB is no longer
 >>> available.  The BLOB/CLOB's transaction may be committed, or its
 >>> connection is closed.
 >
 > Well, I am only calling commit after I have read the blob and pulled
 > all data from it.  The strange thing is that I can get back a bunch
 > of records (and I am calling commit() after I retrieve all the data
 > from the current cursor) and then call ResultSet#next().  So I get
 > through a bunch of rows, but eventually the SQLException pops up.  I
 > know it's not good for performance to call commit() after every
 > result, but I am just testing.
 >
 > So, I basically test this like:
 >
 > while(rs.next()) {
 >      final String content;
 >      content = blobToString(rs.getBlob("content"));
 >      conn.commit();
 > }
 >
 > //very rudimentary method to convert to a string
 > private String blobToString(final Blob blob) throws SQLException{
 >    final byte[] bytes;
 >    final String result;
 >
 >     //not worried about the length at the moment, we don't store data that
 >     //large in our application, max is 64K
 >    bytes = new byte[(int)blob.length()];
 >    result = new String(bytes);
 >
 >    blob.free();
 >
 >    return result;
 > }
 >
 > That's the gist of it.  Still trying to figure out why that didn't work.
 >
 > Again, thanks for all your time.  Great list of people here!
 >
 >> A Blob/Clob object is not valid after the transaction it was created in
 >> has committed.
 >>
 >> --
 >> Øystein
 >>
 >
 >
 >


-- 
Øystein Grøvlen, Senior Staff Engineer
Architectural Lead, Java DB
Sun Microsystems, Database Technology Group
Trondheim, Norway