ibatis-user-java mailing list archives

From "Rao, Satish" <satish....@fmr.com>
Subject RE: Best mechanism to read large datasets
Date Fri, 08 Jul 2005 14:18:08 GMT
I would definitely like that, but unfortunately, that is not an option.

	-----Original Message-----
	From: Darek Dober [mailto:dooverone@op.pl] 
	Sent: Friday, July 08, 2005 10:16 AM
	To: user-java@ibatis.apache.org
	Subject: Re: Best mechanism to read large datasets
	
	
	Can't you divide your result into pages with a smaller number of records?
	 
	queryForList with the skip and max-results parameters should do that.
	 
	Darek
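
Presumably the parameters meant here are the skip/max overload of queryForList. A minimal sketch of that paging approach, assuming iBATIS 2.x, a sqlmap-config.xml on the classpath, and a hypothetical mapped statement named "getOrders":

import java.io.Reader;
import java.util.List;

import com.ibatis.common.resources.Resources;
import com.ibatis.sqlmap.client.SqlMapClient;
import com.ibatis.sqlmap.client.SqlMapClientBuilder;

public class PagedQueryExample {

    public static void main(String[] args) throws Exception {
        // Build the client from the (hypothetical) sqlmap-config.xml on the classpath.
        Reader reader = Resources.getResourceAsReader("sqlmap-config.xml");
        SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);

        int pageSize = 500; // rows fetched per call
        int skip = 0;       // offset into the overall result set
        List page;
        do {
            // queryForList(id, parameterObject, skip, max) returns one page at a time.
            page = sqlMap.queryForList("getOrders", null, skip, pageSize);
            for (int i = 0; i < page.size(); i++) {
                process(page.get(i)); // hypothetical per-row work
            }
            skip += pageSize;
        } while (!page.isEmpty());
    }

    private static void process(Object row) {
        // placeholder for whatever needs to happen with each row
    }
}

Each call only materializes one page of rows in memory, at the cost of re-running the query once per page.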

		----- Original Message ----- 
		From: Rao, Satish <satish.rao@fmr.com>
		To: user-java@ibatis.apache.org 
		Sent: Friday, July 08, 2005 4:10 PM
		Subject: Best mechanism to read large datasets


		I have a situation where I might be retrieving thousands of rows from
		Oracle. I am using queryForList() and I am assuming it stores all the
		rows in memory. As a result, I am quickly running out of memory.

		Is there a way for ibatis to read each row from the table and discard
		it as soon as it is processed, so I don't run into out-of-memory
		issues?
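
If paging really is not an option, iBATIS 2.x also offers a RowHandler callback via SqlMapClient.queryWithRowHandler, which hands each mapped row to application code one at a time instead of collecting the whole result into a List. A sketch under the same assumptions as above (the "getOrders" statement id and the process() helper are hypothetical):

import java.io.Reader;

import com.ibatis.common.resources.Resources;
import com.ibatis.sqlmap.client.SqlMapClient;
import com.ibatis.sqlmap.client.SqlMapClientBuilder;
import com.ibatis.sqlmap.client.event.RowHandler;

public class StreamingQueryExample {

    public static void main(String[] args) throws Exception {
        Reader reader = Resources.getResourceAsReader("sqlmap-config.xml");
        SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);

        // Each mapped row is passed to handleRow() as it is read; no List of
        // the full result set is built up, so memory use stays bounded.
        sqlMap.queryWithRowHandler("getOrders", null, new RowHandler() {
            public void handleRow(Object row) {
                process(row); // hypothetical per-row work; the row can be discarded afterwards
            }
        });
    }

    private static void process(Object row) {
        // placeholder for whatever needs to happen with each row
    }
}

Actual memory behaviour still depends on the JDBC driver's fetch size, so that setting may also need tuning for Oracle.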


