db-derby-user mailing list archives

From "Sedillo, Derek (Mission Systems)" <Derek.Sedi...@ngc.com>
Subject RE: Import data from large Oracle table
Date Wed, 14 Mar 2007 22:05:15 GMT

There are no foreign keys or triggers on the tables, but there are
indexes which might be slowing things down.  So I might drop the indexes
and try running the import again.
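A minimal sketch of that approach in ij, assuming a hypothetical index name SSJ_CNT_IDX on a hypothetical column SSJ_ID (the actual names will differ):

```sql
-- Drop the index so rows are not indexed one-by-one during the bulk load
DROP INDEX SSJ_CNT_IDX;

-- Re-run the import into the now index-free table
CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE(null, 'SSJ_CNT', 'SSJ_CNT.csv',
                                   null, null, null, 0);

-- Rebuild the index in one pass after all rows are loaded
CREATE INDEX SSJ_CNT_IDX ON SSJ_CNT (SSJ_ID);
```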

Do you think that 2 million records are too many to use with the import
utility?



-----Original Message-----
From: Suresh Thalamati [mailto:suresh.thalamati@gmail.com] 
Sent: Wednesday, March 14, 2007 4:00 PM
To: Derby Discussion
Subject: Re: Import data from large Oracle table

Sedillo, Derek (Mission Systems) wrote:
> Hello,
> I have several large Oracle tables (2+ million records) which I need 
> to import data from to Derby.  Here is what I have tried so far:
> 1.  I have dumped the data to comma separated flat files.
> 2.  Used the import table utility like this:
> CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE
> (null,'SSJ_CNT','SSJ_CNT.csv',null,null,null,0);

Import will run slow if the table has triggers, foreign key
references, or if the table already has some data. These force each row
to be logged, which slows the import down.

If you have foreign keys/triggers you may want to drop them and add them
back after the data is imported.  If you do that, the import might run
faster.
> 3.  After '4 hours' of running it appears to have frozen up.  There
> are both a db.lck and a dbex.lck file present, which I will delete soon.

These files are not related to import; you should not delete them.
They are used to prevent multiple JVM instances from booting the same
database concurrently.

> Do you think that 2 million records is just too much for this utility
> to handle?  Is there a better way to transfer data from another database
> besides the IMPORT_TABLE utility?

I have not tried it with 2 million rows, but IMPORT_TABLE is the fastest
way to transfer data from files into a Derby database.

Hope that helps.
