Hi all,

I have a database with three tables (Table_A, Table_B and Table_C). Table_A has an identity column which Table_B and Table_C reference as a foreign key with ON DELETE CASCADE.

Here is the table structure ....

Table_A : Col_x_ID, Col_y_Name, ....
Table_B : Col_p_ID, Col_q_Name, .... Col_x_ID ... FOREIGN KEY (Col_x_ID) REFERENCES Table_A (Col_x_ID) ON DELETE CASCADE
Table_C : Col_m_ID, Col_n_Name, .... Col_x_ID ... FOREIGN KEY (Col_x_ID) REFERENCES Table_A (Col_x_ID) ON DELETE CASCADE
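
In case it helps, here is roughly the DDL (the INT/VARCHAR types and lengths are just placeholders; the real tables have more columns):

    CREATE TABLE Table_A (
        Col_x_ID INT GENERATED ALWAYS AS IDENTITY,
        Col_y_Name VARCHAR(100),
        -- other columns omitted
        PRIMARY KEY (Col_x_ID)
    );

    CREATE TABLE Table_B (
        Col_p_ID INT NOT NULL,
        Col_q_Name VARCHAR(100),
        Col_x_ID INT,
        -- other columns omitted
        PRIMARY KEY (Col_p_ID),
        FOREIGN KEY (Col_x_ID) REFERENCES Table_A (Col_x_ID) ON DELETE CASCADE
    );

    CREATE TABLE Table_C (
        Col_m_ID INT NOT NULL,
        Col_n_Name VARCHAR(100),
        Col_x_ID INT,
        -- other columns omitted
        PRIMARY KEY (Col_m_ID),
        FOREIGN KEY (Col_x_ID) REFERENCES Table_A (Col_x_ID) ON DELETE CASCADE
    );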

When I delete a single row from Table_A, CPU usage spikes to almost 100% and memory usage jumps from about 40 MB to 100 MB. This stays elevated for several seconds (around 5), which nearly freezes the UI of my application. Even worse, it causes an OutOfMemoryError!
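
For reference, the delete itself is nothing fancy; it is a single-row delete by the identity key, something like this (issued from my application over JDBC, with the ? bound to the ID of the row being removed):

    DELETE FROM Table_A WHERE Col_x_ID = ?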

I am pretty sure something isn't quite right here. I know row inserts and deletes are computationally expensive, but surely a single cascaded delete should be cheaper than this.

I'd appreciate it if anyone could tell me how to optimise this, or whether there is a more efficient way to delete rows with a cascade effect.

I am running Derby 10.4.1.3 on Mac OS X with JRE 1.5 (update 6).

Thanks,
Jay
