hadoop-hdfs-user mailing list archives

From Brahma Reddy Battula <brahmareddy.batt...@huawei.com>
Subject checksum error
Date Mon, 30 Apr 2012 07:07:47 GMT

I started a Hadoop cluster with one NameNode and one DataNode and wrote one file with a replication factor of one.
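
For clarity, this step is roughly like the following sketch (the path and data are just examples, not necessarily how I wrote the file):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WriteWithReplicationOne {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/user/test/sample.txt"); // hypothetical path
        // create(path, overwrite, bufferSize, replication, blockSize)
        FSDataOutputStream out = fs.create(file, true, 4096, (short) 1,
                fs.getDefaultBlockSize());
        out.writeBytes("some test data\n");
        out.close();
    }
}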

Then, to provoke a checksum error, I edited the written file's block directly on the DataNode's local disk where it is physically stored.
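
A sketch of the kind of edit I mean, appending garbage bytes to the block file (the block path is hypothetical; the real one is under dfs.data.dir and can be found from the block ID in the fsck report):

import java.io.FileOutputStream;

public class CorruptBlock {
    public static void main(String[] args) throws Exception {
        // Hypothetical block file location under the DataNode data directory.
        String blockPath = "/data/dfs/data/current/blk_1234567890";
        FileOutputStream out = new FileOutputStream(blockPath, true); // append mode
        out.write("garbage".getBytes());
        out.close();
    }
}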

When I then tried to read the file, I got a "could not obtain block" error because the block was corrupted, and a checksum error appeared in the DataNode logs.

After that I removed whatever I had edited from the block, restoring its original contents.

Then I tried to read the file again. Using the FsShell commands I can read it, even though fsck still reports the block as corrupted. But when I use the readFully API I get an EOFException.
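
For reference, the readFully call I mean is roughly like the sketch below (the path and the length handling are illustrative, not my exact code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadFullyCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/user/test/sample.txt"); // hypothetical path
        int len = (int) fs.getFileStatus(file).getLen();
        byte[] buf = new byte[len];
        FSDataInputStream in = fs.open(file);
        try {
            // Throws EOFException if the stream ends before 'len' bytes are read.
            in.readFully(0, buf, 0, len);
        } finally {
            in.close();
        }
        System.out.println(new String(buf));
    }
}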

After reverting, I no longer see any checksum errors in the DataNode logs.

Please let me know what the expected behavior is once the block is reverted back to its original contents.

Thanks and Regards

Brahma Reddy
