ibatis-user-java mailing list archives

From "Darek Dober" <doover...@op.pl>
Subject Re: Best mechanism to read large datasets
Date Fri, 08 Jul 2005 15:01:42 GMT
Can you, or anyone else, explain how it works, or what the best practice is?
I think the statement stays open and, for each row, the method public void handleRow(Object object)
is invoked, or something like that.
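
Something like this, I guess (just a rough sketch: "findAllOrders" is a made-up statement id
and I assume its resultClass is java.util.HashMap, so adjust both to your own sqlmap):

import java.sql.SQLException;
import java.util.Map;

import com.ibatis.sqlmap.client.SqlMapClient;
import com.ibatis.sqlmap.client.event.RowHandler;

public class CountingRowHandler implements RowHandler {

    private int processed = 0;

    // iBATIS calls this once per row while it walks the ResultSet,
    // so only the current row is held in memory, never the whole list.
    public void handleRow(Object valueObject) {
        Map row = (Map) valueObject;
        // ... do the per-row work here ...
        processed++;
    }

    public int getProcessed() {
        return processed;
    }

    // Usage, given an already configured SqlMapClient:
    public static void processAll(SqlMapClient sqlMap) throws SQLException {
        CountingRowHandler handler = new CountingRowHandler();
        sqlMap.queryWithRowHandler("findAllOrders", null, handler);
        System.out.println("rows processed: " + handler.getProcessed());
    }
}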

I assume that we want to process a large dataset and do some operations on it.

Wouldn't it be better in that case to use a stored procedure, e.g. with a cursor in the database
to handle this, if we have that option?
It seems it would be much faster than going through JDBC. We are working on large datasets.

I don't have time for an exercise right now, but for processing 500 000 records, using only a
stored procedure should be much faster.
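
For comparison, the Java side of the stored procedure route could be as small as this (again
only a sketch: PROCESS_ALL_RECORDS is a hypothetical PL/SQL procedure that would open its own
cursor and do the whole row-by-row loop inside the database):

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;

public class StoredProcRunner {

    public static void run(Connection conn) throws SQLException {
        CallableStatement call = conn.prepareCall("{ call PROCESS_ALL_RECORDS() }");
        try {
            call.execute();   // all the heavy lifting happens on the server side
        } finally {
            call.close();
        }
    }
}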

What are your opinions?



Darek Dober
  ----- Original Message ----- 
  From: Michal Malecki 
  To: user-java@ibatis.apache.org 
  Sent: Friday, July 08, 2005 4:22 PM
  Subject: Re: Best mechanism to read large datasets


  Hello,
  I think queryWithRowHandler should do the trick :)

  Regards
  Michal Malecki
  ----- Original Message ----- 
    From: Rao, Satish 
    To: user-java@ibatis.apache.org 
    Sent: Friday, July 08, 2005 4:10 PM
    Subject: Best mechanism to read large datasets


    I have a situation where I might be retrieving thousands of rows from Oracle. I am using
queryForList() and I assume it stores all the rows in memory. As a result I am quickly
running out of memory.

    Is there a way for iBATIS to read each row from the table and discard it as soon as it has
been processed, so I don't run into out-of-memory issues?
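
    For reference, this is roughly the pattern I am using now (the "findAllOrders" statement id
is just an example name); as far as I can tell it pulls the entire result into a single List:

        // sqlMap is an already configured com.ibatis.sqlmap.client.SqlMapClient
        java.util.List rows = sqlMap.queryForList("findAllOrders", null);
        for (java.util.Iterator it = rows.iterator(); it.hasNext();) {
            java.util.Map row = (java.util.Map) it.next();
            // ... process the row ...
        }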


