hadoop-common-issues mailing list archives

From "sathish (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-9114) After defining dfs.checksum.type as NULL, writing a file and calling hflush will throw java.lang.ArrayIndexOutOfBoundsException
Date Fri, 22 Nov 2013 04:47:39 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-9114?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13829646#comment-13829646 ]

sathish commented on HADOOP-9114:
---------------------------------

Thanks Uma for reviewing and committing the patch.

According to your comments, a test case is needed to verify dfs.checksum.type, so I can add
that test case in the HDFS project (sketched below).
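
A minimal sketch of the kind of HDFS test the comment proposes, assuming a JUnit test
against MiniDFSCluster; the class and method names here are hypothetical, not the test
that was actually committed:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.MiniDFSCluster;
    import org.junit.Test;

    public class TestChecksumTypeNull {
      @Test
      public void testHflushWithNullChecksum() throws Exception {
        Configuration conf = new Configuration();
        // The configuration under test: a zero-byte (NULL) checksum type.
        conf.set("dfs.checksum.type", "NULL");
        MiniDFSCluster cluster = new MiniDFSCluster.Builder(conf).build();
        try {
          FileSystem fs = cluster.getFileSystem();
          FSDataOutputStream out = fs.create(new Path("/testNullChecksum"));
          out.write(new byte[1024]);
          // Before HADOOP-9114, this hflush() threw
          // java.lang.ArrayIndexOutOfBoundsException from FSOutputSummer.
          out.hflush();
          out.close();
        } finally {
          cluster.shutdown();
        }
      }
    }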

> After defining dfs.checksum.type as NULL, writing a file and calling hflush will throw java.lang.ArrayIndexOutOfBoundsException
> -------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-9114
>                 URL: https://issues.apache.org/jira/browse/HADOOP-9114
>             Project: Hadoop Common
>          Issue Type: Bug
>    Affects Versions: 2.0.1-alpha
>            Reporter: liuyang
>            Assignee: sathish
>            Priority: Minor
>             Fix For: 3.0.0, 2.3.0, 2.2.1
>
>         Attachments: FSOutputSummer.java.patch, HADOOP-9114-001.patch, HADOOP-9114-002.1.patch, HADOOP-9114-002.patch
>
>
> While testing the dfs.checksum.type configuration parameter: the value can be defined
> as NULL, CRC32C, or CRC32. Writing works when the value is CRC32C or CRC32, but the
> client throws java.lang.ArrayIndexOutOfBoundsException when the value is configured as NULL.
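
A minimal repro sketch matching the report, assuming fs.defaultFS points at a running
HDFS cluster (the URI and file path below are placeholders, not from the report):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ChecksumNullRepro {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder NameNode URI; point this at a real cluster.
        conf.set("fs.defaultFS", "hdfs://localhost:8020");
        // The reported trigger: checksum type NULL (vs. CRC32/CRC32C, which work).
        conf.set("dfs.checksum.type", "NULL");
        FileSystem fs = FileSystem.get(conf);
        FSDataOutputStream out = fs.create(new Path("/tmp/checksum-null-test"));
        out.write("hello".getBytes("UTF-8"));
        // On affected versions this hflush() fails with
        // java.lang.ArrayIndexOutOfBoundsException; after the fix it succeeds.
        out.hflush();
        out.close();
        fs.close();
      }
    }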



--
This message was sent by Atlassian JIRA
(v6.1#6144)
