Hi All,
  Is there anything special or specific I have to do after writing a large binary stream to my database in order to reclaim memory resources?

  My use-case is:
  1.  Create a new row containing a name, content, and content size (the content is a 500MB file), roughly as sketched below.
  2.  Commit via the connection.  (autocommit is turned off)
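
  For reference, the insert is done along these lines (a simplified sketch;
  the table and column names are placeholders, not my actual schema):

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    // Sketch only:  DOCUMENT/NAME/CONTENT/CONTENT_SIZE are placeholders.
    static void insertDocument(final Connection connection, final File file)
            throws Exception {
        final PreparedStatement ps = connection.prepareStatement(
            "insert into DOCUMENT (NAME, CONTENT, CONTENT_SIZE) values (?, ?, ?)");
        final InputStream in = new FileInputStream(file);
        try {
            ps.setString(1, file.getName());
            // Stream the content so the full 500MB is never held in the
            // application's memory at once.
            ps.setBinaryStream(2, in, (int) file.length());
            ps.setLong(3, file.length());
            ps.executeUpdate();
        } finally {
            in.close();
            ps.close();
        }
        connection.commit();  // autocommit is off
    }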

  What I'm noticing is that memory usage jumps from 1.2 MB (after the driver is loaded) to 41.9 MB after the insertion.  After numerous manual GCs, the memory remains at 34 MB.

  I've made no attempt to tune page/cache sizes; is this what I'm missing?
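
  If tuning is the missing piece, my guess is it would look something like
  the following (derby.storage.pageCacheSize and derby.storage.pageSize are
  real Derby properties, but the values here are arbitrary guesses on my
  part):

    // Properties must be set before the embedded engine boots.
    public static void main(final String[] args) throws Exception {
        // Shrink the page cache from its default of 1000 pages.
        System.setProperty("derby.storage.pageCacheSize", "200");
        // Use larger pages for newly created (blob-heavy) tables.
        System.setProperty("derby.storage.pageSize", "32768");
        Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
        // ...obtain a connection and perform the insert as above...
    }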

I'm using:
  Derby 10.2.2.0
  Java 1.6.0-b105
  Ubuntu 6.06.1 LTS

--
--------------------------------------------------------------------------------
Raymond Kroeker
thinkParity Solutions Inc.