db-derby-user mailing list archives

From Suresh Thalamati <suresh.thalam...@gmail.com>
Subject Re: Import data from large Oracle table
Date Wed, 14 Mar 2007 21:59:57 GMT
Sedillo, Derek (Mission Systems) wrote:
> Hello,
>  
> I have several large Oracle tables (2+ million records) which I need to 
> import data from to Derby.  Here is what I have tried so far:
>  
> 1.  I have dumped the data to comma separated flat files.
> 2.  Used the import table utility like this:
>     CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE 
> (null,'SSJ_CNT','SSJ_CNT.csv',null, null,null,0);


Import will run slowly if the table has triggers or foreign key 
references, or if the table already contains some data. These force the 
inserted rows to be logged, which slows the import down.

If you have foreign keys or triggers, you may want to drop them and add 
them back after the data is imported. If you do that, the import might 
run faster.
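A sketch of that drop/import/re-add sequence, reusing the SSJ_CNT table and file names from the original message; the constraint name FK_SSJ_CNT and the referenced columns are hypothetical placeholders for whatever your schema actually defines:

```sql
-- Hypothetical constraint name; check SYS.SYSCONSTRAINTS for the real one.
-- 1. Drop the foreign key so imported rows are not checked one by one.
ALTER TABLE SSJ_CNT DROP CONSTRAINT FK_SSJ_CNT;

-- 2. Run the import into the now-unconstrained table.
CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE
  (null, 'SSJ_CNT', 'SSJ_CNT.csv', null, null, null, 0);

-- 3. Re-create the constraint; Derby validates the existing rows once,
--    which is much cheaper than a per-row check during the import.
ALTER TABLE SSJ_CNT ADD CONSTRAINT FK_SSJ_CNT
  FOREIGN KEY (PARENT_ID) REFERENCES PARENT_TABLE (ID);
```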



> 3.  After '4 hours' of running it appears to have frozen up.  There is 
> both a db.lck and dbex.lck file present which I will delete soon.

These files are not related to import, and you should not delete 
them. They are used to prevent multiple JVM instances from booting
the same database concurrently.


>  
> Do you think that 2 million records is just too much for this utility to 
> handle?  Is there a better way to transfer data from another database 
> besides the IMPORT_TABLE utility?
>  

I have not tried it with 2 million rows, but IMPORT_TABLE is the fastest 
way to transfer data from files into a Derby database.
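One more option, assuming you can afford to replace the table's contents entirely: the same procedure in replace mode (last argument 1 instead of 0) empties the table and rebuilds it from the file, which lets Derby keep logging to a minimum:

```sql
-- Replace mode: the final argument is 1, so Derby discards any existing
-- rows in SSJ_CNT and bulk-loads the file into the emptied table.
CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE
  (null, 'SSJ_CNT', 'SSJ_CNT.csv', null, null, null, 1);
```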



Hope that helps,
-suresh

