hadoop-common-dev mailing list archives

From "Boris Shkolnik (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-4999) IndexOutOfBoundsException in FSEditLog
Date Wed, 28 Jan 2009 00:58:59 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-4999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12667897#action_12667897 ]

Boris Shkolnik commented on HADOOP-4999:
----------------------------------------

ran test-patch
ANT_HOME=/home/hadoopqa/tools/ant/apache-ant-1.7.1 ant -Dpatch.file=../patches/HADOOP-4999-1.patch
-Dfindbugs.home=/home/ndaley/tools/findbugs/latest -Dforrest.home=/home/ndaley/tools/forrest/latest
-Djava5.home=/usr/releng/tools/java/jdk1.5.0_06 -Dscratch.dir=../scratch_dir/ test-patch

results:
     [exec] -1 overall.  
     [exec] 
     [exec]     +1 @author.  The patch does not contain any @author tags.
     [exec] 
     [exec]     -1 tests included.  The patch doesn't appear to include any new or modified tests.
     [exec]                         Please justify why no tests are needed for this patch.
     [exec] 
     [exec]     +1 javadoc.  The javadoc tool did not generate any warning messages.
     [exec] 
     [exec]     +1 javac.  The applied patch does not increase the total number of javac compiler warnings.
     [exec] 
     [exec]     +1 findbugs.  The patch does not introduce any new Findbugs warnings.
     [exec] 
     [exec]     +1 Eclipse classpath. The patch retains Eclipse classpath integrity.
     [exec] 
     [exec] 
     [exec] 


No test is included because this covers a failure case, which I tested manually.

> IndexOutOfBoundsException in FSEditLog
> --------------------------------------
>
>                 Key: HADOOP-4999
>                 URL: https://issues.apache.org/jira/browse/HADOOP-4999
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: dfs
>            Reporter: Boris Shkolnik
>            Assignee: Boris Shkolnik
>             Fix For: 0.20.0
>
>         Attachments: HADOOP-4999-1.patch, HADOOP-4999.patch
>
>
> when we go over the collection of editStreams in FSEditLog::logEdit, we pre-calculate the number of iterations for the "for" loop:
> int numEditStreams = editStreams.size();
> for (int idx = 0; idx < numEditStreams; idx++) {
> ...
> processIOError(idx);
> ...
> }
> but an IOError may trigger a call to processIOError(idx), which removes an editStream from editStreams inside the loop, and that causes an IndexOutOfBoundsException when the end of the collection is reached.
> proposed fix: recalculate the size of the collection on every iteration (this is very cheap because it just returns an integer).
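
For illustration, here is a minimal self-contained sketch of the pattern described above and of the proposed fix. The class name, the Appendable element type, and the processIOError(idx) helper are stand-ins taken from the description, not the actual FSEditLog code:

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Minimal sketch, not the real FSEditLog: illustrates why a pre-computed
// loop bound breaks once processIOError() removes a stream mid-iteration.
public class EditLogLoopSketch {

  // Placeholder for FSEditLog's collection of edit log output streams.
  private final List<Appendable> editStreams = new ArrayList<Appendable>();

  // Hypothetical stand-in for processIOError(idx): drops the failed stream.
  private void processIOError(int idx) {
    editStreams.remove(idx);
  }

  // Buggy pattern from the description: the bound is computed once, so after
  // a removal, get(idx) on a later iteration can run past the shrunken list
  // and throw IndexOutOfBoundsException.
  void logEditBuggy(CharSequence op) {
    int numEditStreams = editStreams.size();
    for (int idx = 0; idx < numEditStreams; idx++) {
      try {
        editStreams.get(idx).append(op);
      } catch (IOException e) {
        processIOError(idx);   // the list shrinks, numEditStreams does not
      }
    }
  }

  // Proposed fix: re-read size() on every iteration (a cheap call),
  // so the loop bound tracks removals.
  void logEditFixed(CharSequence op) {
    for (int idx = 0; idx < editStreams.size(); idx++) {
      try {
        editStreams.get(idx).append(op);
      } catch (IOException e) {
        processIOError(idx);
      }
    }
  }
}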

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

