db-derby-dev mailing list archives

From David Van Couvering <David.Vancouver...@Sun.COM>
Subject Re: Bulk load of large data sets
Date Thu, 27 Jan 2005 18:08:04 GMT
Hi, Suresh.  I know I was going to get back to you, but I have been 
pulled onto other things.  I appreciate your interest and desire to 
help.  Hopefully I can get back on this again at some point in the future.

David

Suresh Thalamati wrote:

>
> Hi David,
>
> I briefly looked at the import code in the org.apache.derby.impl.load
> package; it does not seem to be designed to keep all the rows being
> imported in memory. It appears to read one row at a time through a
> BufferedReader.
> I tested importing 300 MB of data into an emp(id int, name char(100),
> dept char(30)) table, and JVM memory usage stayed below 30 MB. Do you
> see any memory growth when you import a smaller data set, say 300 MB,
> in your environment?
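>
> For reference, here is roughly how such a test can be driven from
> JDBC (a minimal sketch; the database name, data file, and table are
> placeholders from my test setup, not from yours):
>
>     import java.sql.CallableStatement;
>     import java.sql.Connection;
>     import java.sql.DriverManager;
>
>     public class ImportTest {
>         public static void main(String[] args) throws Exception {
>             // Boot the embedded Derby engine and open a connection.
>             Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
>             Connection conn = DriverManager.getConnection(
>                 "jdbc:derby:testdb;create=true");
>
>             // SYSCS_IMPORT_TABLE(schema, table, file, column delimiter,
>             // character delimiter, codeset, replace flag); nulls pick
>             // the defaults (comma, double quote, platform codeset).
>             CallableStatement cs = conn.prepareCall(
>                 "CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE(?, ?, ?, ?, ?, ?, ?)");
>             cs.setString(1, "APP");
>             cs.setString(2, "EMP");
>             cs.setString(3, "emp.dat");      // placeholder data file
>             cs.setNull(4, java.sql.Types.CHAR);
>             cs.setNull(5, java.sql.Types.CHAR);
>             cs.setNull(6, java.sql.Types.VARCHAR);
>             cs.setShort(7, (short) 0);       // 0 = append to the table
>             cs.execute();
>             cs.close();
>             conn.close();
>         }
>     }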
>
> It is possible that you are hitting some weird bug; I doubt anyone
> has tried this case before.
> I would be interested in debugging this problem if you could provide
> the following information:
>
> 1) Schema of the table (indexes, etc.)
> 2) A few rows of the data that you are importing.
> 3) All the parameters you are passing to the IMPORT procedure.
> 4) What JVM are you using?
> 5) Parameters that you are passing to the JVM, such as heap size.
> 6) Are you doing this in Network Server or embedded mode?
> 7) Are you executing the import procedure through IJ or a JDBC
>    program? (See the JDBC sketch above and the IJ sketch below.)
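>
> For comparison, through IJ the call would look roughly like this
> (schema, table, and file name are the placeholders from my test):
>
>     ij> CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE(
>             'APP', 'EMP', 'emp.dat', null, null, null, 0);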
>
>
> David Van Couvering wrote:
>
>> Hi, all.  I get a Java OutOfMemory error when I try to load a large
>> table into a Derby database using the bulk load stored procedure
>> SYSCS_UTIL.SYSCS_IMPORT_TABLE.  It is a 5GB table. It appears that this
>> procedure tries to load the entire table into cache before it
>> checkpoints it?
>
> I don't believe that is the case.  The page cache writes pages out
> and reuses them whenever it needs new ones; it does not grow without
> bound.  The default size should be around 1000 pages.
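>
> As a rough sanity check (assuming the default 4K page size, which the
> derby.storage.pageSize property controls), 1000 pages is only about
> 4 MB of cache. If you want to experiment, the cache size can be tuned
> through the derby.storage.pageCacheSize property, for example:
>
>     // Must be set before the Derby engine boots; equivalently, put
>     // derby.storage.pageCacheSize=1000 in derby.properties.
>     System.setProperty("derby.storage.pageCacheSize", "1000");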
>
>> Any advice would be much appreciated; I can't allocate
>> 5GB of memory to Java...
>>
>>
> It is possible for a 5GB import to use more memory than normal if the
> garbage collector is not able to keep up, but it certainly should not
> require 5GB of memory.
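>
> One way to check whether garbage collection is the issue is to run
> the import with a deliberately small heap and GC logging enabled, for
> example (the class name is just a placeholder for whatever program
> drives the import):
>
>     java -Xmx64m -verbose:gc ImportTest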
>
>
>
> Thanks
> -suresh

