hadoop-common-user mailing list archives

From Chris Nauroth <cnaur...@hortonworks.com>
Subject Re: hadoop datanode read or write block error
Date Wed, 16 Dec 2015 20:53:38 GMT
Hello Evan,

It looks like there are two separate problems.

The errors that mention "disk usage" indicate that the DataNode forked a "du -sk" process, but
the command returned no output.  I recommend checking that du is available on the PATH on this
machine and that the command works as expected against the DataNode's data directories.
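A quick sanity check might look like the following (the /data/dfs/dn path is just a placeholder; substitute whatever dfs.datanode.data.dir points at on your machine):

```shell
# Verify du is resolvable on the PATH the DataNode uses.
command -v du || echo "du not found on PATH"

# Run the same command the DataNode runs and confirm it prints a line of output.
# Replace /data/dfs/dn with your actual dfs.datanode.data.dir.
du -sk /data/dfs/dn
```

If the second command prints nothing or errors out (for example because the directory is unreadable or the volume is failing), the DataNode will hit the same "Expecting a line not the end of stream" exception.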

The error about the WRITE_BLOCK operation indicates a failure to fully deserialize a client
request to write a block.  Interestingly, the DataNode was able to deserialize enough of the
message to identify it as a WRITE_BLOCK operation, but it then failed while parsing the
remaining payload.  Is it possible that the client attempting the write is running a very old
version of the HDFS client code?
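One way to check this would be to run the following on the client host identified by the src address in the log (172.19.206.138 in your case) and compare against the 2.3 version on the DataNodes:

```shell
# Print the Hadoop version the client-side code is built from.
# A client from a much older release line speaks an older data transfer
# protocol and can fail partway through parsing on a 2.3 DataNode.
hadoop version
```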

--Chris Nauroth

From: yaoxiaohua <yaoxiaohua@outlook.com>
Date: Tuesday, December 15, 2015 at 11:16 PM
To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Subject: hadoop datanode read or write block error

Hi,
                This is my first email asking a question about Hadoop.
Hadoop version: Hadoop 2.3
JDK: IBM JDK 1.7
Issue:
I found many errors like the following in the DataNode process log:
2015-12-16 14:54:28,438 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-426197605-10.19.206.101-1406809206259:blk_1251972645_178255413
src: /172.19.206.118:58220 dest: /172.19.206.142:50011
2015-12-16 14:54:31,898 WARN org.apache.hadoop.util.Shell: Could not get disk usage information
java.io.IOException: Expecting a line not the end of stream
        at org.apache.hadoop.fs.DU.parseExecResult(DU.java:233)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:487)
        at org.apache.hadoop.util.Shell.run(Shell.java:418)
        at org.apache.hadoop.fs.DU.run(DU.java:190)
        at org.apache.hadoop.fs.DU$DURefreshThread.run(DU.java:119)
        at java.lang.Thread.run(Thread.java:809)
2015-12-16 14:54:32,189 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException
in offerService
java.io.IOException: Expecting a line not the end of stream
        at org.apache.hadoop.fs.DU.parseExecResult(DU.java:233)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:487)
        at org.apache.hadoop.util.Shell.run(Shell.java:418)
        at org.apache.hadoop.fs.DU.run(DU.java:190)
        at org.apache.hadoop.fs.DU$DURefreshThread.run(DU.java:119)
        at java.lang.Thread.run(Thread.java:809)
ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: px42pub:50011:DataXceiver error processing
WRITE_BLOCK operation
 src: /172.19.206.138:47717 dest: /172.19.206.142:50011
java.lang.IllegalStateException: Current state = RESET, new state = FLUSHED
        at java.nio.charset.CharsetEncoder.throwIllegalStateException(CharsetEncoder.java:968)
        at java.nio.charset.CharsetEncoder.flush(CharsetEncoder.java:657)
        at java.nio.charset.CharsetEncoder.encode(CharsetEncoder.java:786)
        at org.apache.hadoop.io.Text.encode(Text.java:443)
        at org.apache.hadoop.io.Text.set(Text.java:198)
        at org.apache.hadoop.io.Text.<init>(Text.java:88)
        at org.apache.hadoop.hdfs.protocolPB.PBHelper.convert(PBHelper.java:714)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:124)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:229)
        at java.lang.Thread.run(Thread.java:809)

Could you give me any suggestions about this?

Thanks.

Best Regards,
Evan Yao

