hadoop-hdfs-issues mailing list archives

From "Bogdan Raducanu (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (HDFS-9220) Reading small file (< 512 bytes) that is open for append fails due to incorrect checksum
Date Tue, 13 Oct 2015 14:26:05 GMT

     [ https://issues.apache.org/jira/browse/HDFS-9220?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Bogdan Raducanu updated HDFS-9220:
----------------------------------
    Summary: Reading small file (< 512 bytes) that is open for append fails due to incorrect checksum  (was: ChecksumException after writing less than 512 bytes)

> Reading small file (< 512 bytes) that is open for append fails due to incorrect checksum
> ----------------------------------------------------------------------------------------
>
>                 Key: HDFS-9220
>                 URL: https://issues.apache.org/jira/browse/HDFS-9220
>             Project: Hadoop HDFS
>          Issue Type: Bug
>    Affects Versions: 2.7.1
>            Reporter: Bogdan Raducanu
>            Assignee: Jagadesh Kiran N
>         Attachments: test2.java
>
>
> Exception:
> 2015-10-09 14:59:40 WARN  DFSClient:1150 - fetchBlockByteRange(). Got a checksum exception for /tmp/file0.05355529331575182 at BP-353681639-10.10.10.10-1437493596883:blk_1075692769_9244882:0 from DatanodeInfoWithStorage[10.10.10.10]:5001
> All 3 replicas raise this exception, so the read fails entirely with:
> BlockMissingException: Could not obtain block: BP-353681639-10.10.10.10-1437493596883:blk_1075692769_9244882 file=/tmp/file0.05355529331575182
> Code to reproduce is attached.
> Does not happen in 2.7.0.
> Data is read correctly if checksum verification is disabled.

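The actual reproduction code (test2.java) is only attached to the JIRA issue and is not included in this message. The following is a rough, unofficial sketch of the scenario the report describes, not the attachment itself: write fewer than 512 bytes (less than one checksum chunk), hflush while keeping the writer open so the file remains open for append, then read the file from a separate stream. The path, byte count, and class name below are assumptions for illustration only.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AppendReadRepro {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path p = new Path("/tmp/small-file-repro"); // hypothetical path

        // Write fewer than 512 bytes (less than one checksum chunk) and
        // hflush, leaving the stream open so the file stays open for append.
        FSDataOutputStream out = fs.create(p, true);
        out.write(new byte[100]);
        out.hflush();

        // Read the file back while the writer is still open. Per the report,
        // on 2.7.1 this read fails with a ChecksumException on every replica.
        FSDataInputStream in = fs.open(p);
        byte[] buf = new byte[100];
        in.readFully(0, buf);
        in.close();

        out.close();
        fs.close();
    }
}

Per the report, the same read succeeds on 2.7.0, and on 2.7.1 it succeeds if checksum verification is disabled.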


--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
