ibatis-user-java mailing list archives

From Hycel Taylor <hyc...@gmail.com>
Subject Re: Doing Batch Processing with IBatis from DAO Manager
Date Tue, 06 Sep 2005 18:38:04 GMT
Hi Clinton,

 I must ask you to excuse the verbosity of this email, but I need to 
gain a better understanding of how I might go about building a custom 
batch API so that I can get iBATIS DAOs to do batching properly. In our 
last conversation, you explained the following:

 "The batch control DAO is just a convenient, reusable DAO that provides 
access to the batch API from the SQL Mapping framework, or your own custom 
JDBC batching code."

 So, I understand that I need to create a custom batch API. The question is 
how I should go about interfacing with iBATIS SQL statements, and how to do this 
transparently enough that all DAOs simply work when I turn batching on.

 Before using iBATIS, I had my own custom DAO API. The way I handled 
batching was to set a boolean flag, doBatching, to true. When doBatching was 
true, I would create a single Statement from the given connection, 
intercept every insert, update, or delete command, and add each command to the 
statement with statement.addBatch(). Later, I could execute all of the 
statements with statement.executeBatch(). I'm supposing that in my custom 
API, I will need to do something similar with iBATIS SQL statements. I don't 
know for sure. I need help here. 
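For reference, the doBatching pattern I described above can be sketched roughly like this. This is only an illustration: the class name BatchBuffer and the Consumer-based executor are my own invention, not part of JDBC or iBATIS; with real JDBC, add() would call Statement.addBatch(sql) and executeBatch() would call Statement.executeBatch().

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of the doBatching flag pattern: commands are buffered while
// batching is on and handed off in one chunk, instead of being executed
// one at a time over the network.
class BatchBuffer {
    private final List<String> pending = new ArrayList<>();
    private final Consumer<List<String>> executor;
    private boolean doBatching = false;

    BatchBuffer(Consumer<List<String>> executor) {
        this.executor = executor;
    }

    void startBatch() {
        doBatching = true;
    }

    void add(String sql) {
        if (doBatching) {
            pending.add(sql);              // deferred, like Statement.addBatch()
        } else {
            executor.accept(List.of(sql)); // immediate execution
        }
    }

    // Like Statement.executeBatch(): flush everything queued so far.
    int executeBatch() {
        int n = pending.size();
        if (n > 0) {
            executor.accept(new ArrayList<>(pending));
            pending.clear();
        }
        doBatching = false;
        return n;
    }
}
```

The point of the pattern is simply that nothing between startBatch() and executeBatch() touches the database; everything is deferred to one flush.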

 Is this the best way to go about solving this problem with iBATIS? If so, how 
might I go about intercepting iBATIS SQL statements? Is there an 
interface, or one or more methods, that I can use to communicate with iBATIS 
SqlMaps to accomplish this task? Or am I looking at this all wrong and 
should I consider a different approach?

 I also admit to being a little confused about all of this because of 
statements made on page 41 of the iBATIS-SqlMaps-2 Developer Guide, which 
states the following:

-----------------------------------------------------------

"If you have a great number of non-query (insert/update/delete) statements to 
execute, you might like to execute them as a batch to minimize network 
traffic and allow the JDBC driver to perform additional optimization (e.g. 
compression). Using batches is simple with the SqlMap API; two simple methods 
allow you to demarcate the boundaries of the batch: 

sqlMap.startBatch();
// ... execute statements in between ...
sqlMap.executeBatch();

 Upon calling executeBatch(), all batched statements will be executed through 
the JDBC driver."

-----------------------------------------------------------

 Based on my understanding of the explanation above, I created the following DAO:

 public class SqlMapBatchControllerDao extends SqlMapBaseDao implements 
BatchControllerDao {

    public SqlMapBatchControllerDao(DaoManager daoManager) {
        super(daoManager);
    }

    public void startBatch() {
        try {
            getSqlMapExecutor().startBatch();
        } catch (SQLException e) {
            throw new DaoException("Unable to startBatch", e);
        }
    }

    public int executeBatch() {
        try {
            return getSqlMapExecutor().executeBatch();
        } catch (SQLException e) {
            throw new DaoException("Unable to executeBatch", e);
        }
    }
}

 I have a service class that uses this SqlMapBatchControllerDao to start 
batching. All of my service classes share the same transaction, because all 
service classes extend ServiceBase, which uses a transaction manager (the one 
that Brian Barnett shared, which uses a ThreadLocal; thanks, Brian!). So, in my 
migration class I have created two methods for starting and ending batches:

 protected void startBatch() {
    serviceBase.startTransaction();
    batchControllerService.startBatch();
}

 protected void executeBatch() {
    batchControllerService.executeBatch();
    serviceBase.commitTransaction();
    serviceBase.endTransaction();
}

 In my migration class, I have a method called migrate() that performs the 
task of migrating the data from an old database to a new database:

 public void migrate() {
    System.out.println("Migrating exams...");
    try {
        startBatch();
        final ResultSet exams = getExams();
        while (exams.next()) {
            batchCounter++;
            setBaseExamInfomation(exams, exam);
            final CampusCourseVO campusCourse =
                insertCampusCourse(exam, PUBLIC_COURSE_ID);
            if (batchCounter == BATCH_COUNT) {
                executeBatch();
                startBatch();
                updateCounters();
            }
        }
        executeBatch();
        updateCounters();
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        serviceBase.endTransaction();
    }
}

 It's in the method insertCampusCourse() that I discovered batching is 
not working properly. In insertCampusCourse(), I want to check whether the 
campusCourse already exists (where termId, campusId, and courseId find a 
match). If it does not exist, I want to insert it. If it does exist, I want 
to return it in a CampusCourseVO value object. In batch mode this does not 
work properly: even after a campusCourse for a given termId, campusId, and 
courseId was initially not found and was therefore inserted into the 
database, the next time the same termId, campusId, and courseId are 
obtained from the ResultSet and sent to insertCampusCourse(), the 
previously inserted record is still not found, and so it is inserted again. Upon 
running batchControllerService.executeBatch(), a duplicate-key exception is 
rightfully thrown. If I simply comment out the 
batchControllerService.startBatch() and batchControllerService.executeBatch() 
calls, so that records are inserted within a transaction but without 
batching, a match for a given termId, campusId, and courseId will be found, 
and only one row for that combination is inserted (see the code below).

 public CampusCourseVO insertCampusCourse(BaseExam exam, String 
protectedCourseId) throws Exception {
    CampusCourseVO campusCourse = new CampusCourseVO();
    final TermVO term = new TermVO();
    CampusVO campus = new CampusVO();
    CourseVO course = new CourseVO();

    term.setTermId(DEFUALT_SYS_TERM_ID);
    campus = campusService.findByProtectedId(exam.getCampus());
    course = courseService.findByProtectedId(protectedCourseId);

    campusCourse.setTerm(term);
    campusCourse.setCampus(campus);
    campusCourse.setCourse(course);

    try {
        campusCourse = campusCourseService.findByTermCampusCourse(campusCourse);
    } catch (RequestNotFoundException e) {
        campusCourseService.insert(campusCourse);
    }
    return campusCourse;
}
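 One workaround I'm considering (just a sketch; the class PendingKeyGuard and its key format are my own invention, not an iBATIS facility) is to track, in memory, which term/campus/course combinations have already been queued for insert during the current batch, since the database cannot see rows whose INSERTs are still sitting unflushed in the batch:

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of an in-memory guard against queuing duplicate inserts while a
// batch is open. A SELECT cannot see rows whose INSERT is still buffered
// in the batch, so we remember the pending keys ourselves until the flush.
class PendingKeyGuard {
    private final Set<String> pendingKeys = new HashSet<>();

    // Build a composite key for a term/campus/course combination.
    static String keyOf(int termId, int campusId, int courseId) {
        return termId + "|" + campusId + "|" + courseId;
    }

    // Returns true only the first time a key is seen in this batch;
    // false means the insert is already queued and should be skipped.
    boolean markPending(String key) {
        return pendingKeys.add(key);
    }

    // Call after executeBatch(): the rows are now really in the database,
    // so the normal SELECT-before-insert check will find them again.
    void batchFlushed() {
        pendingKeys.clear();
    }
}
```

The idea would be to call markPending() before queuing an insert inside insertCampusCourse(), and batchFlushed() right after each executeBatch().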

 So, again, I apologize for the length of this email. I have two related 
questions here, both concerning how to do batching properly. On one 
hand, it may be possible to batch successfully with SqlMaps and I am 
just not doing it properly. On the other hand, I may actually need to create 
my own batch API, and I need more direction on how best to implement it. I'm 
just not sure, and I would appreciate any advice in this matter so that I can 
accomplish the task of batching with iBATIS DAOs as soon as possible.

 Thanks. 


On 8/13/05, Clinton Begin <clinton.begin@gmail.com> wrote: 
> 
> If you're in the same transaction, and all of your DAOs are using the same 
> database, there's no need to start more than one batch (and in fact it could 
> be causing significant problems for you).
> 
> The batch control DAO is just a convenient, reusable DAO that provides 
> access to the batch API from the SQL Mapping framework, or your own custom 
> JDBC batching code.
> 
> Clinton 
> 
> On 8/13/05, Hycel Taylor <hycel1@gmail.com> wrote: 
> > 
> > Interesting. I'm not sure what, batchControlDao, is doing here.
> > Currently, what I've just learned from others on this forum and what
> > I'm currently doing is the following:
> > 
> > daoManager.startTransaction();
> > 
> > schoolDao.startBatch();
> > majorDao.startBatch();
> > courseDao.startBatch();
> > ...
> > 
> > // do work here.
> > 
> > schoolDao.executeBatch();
> > majorDao.executeBatch();
> > courseDao.executeBatch();
> > ...
> > 
> > daoManager.commitTransaction ();
> > daoManager.endTransaction();
> > 
> > This seems to be working. Although, when I am batching this way, an
> > awful lot of locks are made on the database. This wasn't happening
> > when I was doing straight JDBC. So, I'm not sure if I'm using your DAO 
> > API with batching properly.
> > 
> > What do you suggest?
> > 
> 
>