hadoop-mapreduce-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: issue about write append into hdfs "ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: ch12:50010:DataXceiver error processing READ_BLOCK operation "
Date Fri, 21 Feb 2014 04:04:43 GMT
Which Hadoop release are you using?
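
If you are not sure, the "hadoop version" command on any node prints the exact
release and build string. The same information is also available to Java code
through the standard utility class org.apache.hadoop.util.VersionInfo; below is
a minimal sketch (the class name PrintHadoopVersion is just an illustrative
choice):

    // Minimal sketch: print the Hadoop release this JVM is linked against.
    // VersionInfo is a standard Hadoop utility (org.apache.hadoop.util).
    import org.apache.hadoop.util.VersionInfo;

    public class PrintHadoopVersion {
        public static void main(String[] args) {
            System.out.println("Hadoop version: " + VersionInfo.getVersion());
            System.out.println("Build:          " + VersionInfo.getBuildVersion());
        }
    }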

Cheers


On Thu, Feb 20, 2014 at 8:57 PM, ch huang <justlooks@gmail.com> wrote:

> hi, maillist:
>           I see the following info in my HDFS log. The block belongs to a
> file being written by Scribe, and I do not know why this happens.
> Is there some limit in HDFS that I am hitting?
>
> 2014-02-21 10:33:30,235 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: opReadBlock
> BP-1043055049-192.168.11.11-1382442676609:blk_-8536558734938003208_3823240
> received exception java.io.IOException: Replica gen stamp < block genstamp,
> block=BP-1043055049-192.168.11.11-1382442676609:blk_-8536558734938003208_3823240,
> replica=ReplicaWaitingToBeRecovered, blk_-8536558734938003208_3820986, RWR
>   getNumBytes()     = 35840
>   getBytesOnDisk()  = 35840
>   getVisibleLength()= -1
>   getVolume()       = /data/4/dn/current
>   getBlockFile()    = /data/4/dn/current/BP-1043055049-192.168.11.11-1382442676609/current/rbw/blk_-8536558734938003208
>   unlinked=false
> 2014-02-21 10:33:30,235 WARN
> org.apache.hadoop.hdfs.server.datanode.DataNode:
> DatanodeRegistration(192.168.11.12,
> storageID=DS-754202132-192.168.11.12-50010-1382443087835,
> infoPort=50075, ipcPort=50020,
> storageInfo=lv=-40;cid=CID-0e777b8c-19f3-44a1-8af1-916877f2506c;nsid=2086828354;c=0):Got
> exception while serving
> BP-1043055049-192.168.11.11-1382442676609:blk_-8536558734938003208_3823240
> to /192.168.11.15:56564
> java.io.IOException: Replica gen stamp < block genstamp,
> block=BP-1043055049-192.168.11.11-1382442676609:blk_-8536558734938003208_3823240,
> replica=ReplicaWaitingToBeRecovered, blk_-8536558734938003208_3820986, RWR
>   getNumBytes()     = 35840
>   getBytesOnDisk()  = 35840
>   getVisibleLength()= -1
>   getVolume()       = /data/4/dn/current
>   getBlockFile()    = /data/4/dn/current/BP-1043055049-192.168.11.11-1382442676609/current/rbw/blk_-8536558734938003208
>   unlinked=false
>         at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:205)
>         at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:326)
>         at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:92)
>         at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:64)
>         at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:221)
>         at java.lang.Thread.run(Thread.java:744)
> 2014-02-21 10:33:30,236 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: ch12:50010:DataXceiver
> error processing READ_BLOCK operation  src: /192.168.11.15:56564
> dest: /192.168.11.12:50010
> java.io.IOException: Replica gen stamp < block genstamp,
> block=BP-1043055049-192.168.11.11-1382442676609:blk_-8536558734938003208_3823240,
> replica=ReplicaWaitingToBeRecovered, blk_-8536558734938003208_3820986, RWR
>   getNumBytes()     = 35840
>   getBytesOnDisk()  = 35840
>   getVisibleLength()= -1
>   getVolume()       = /data/4/dn/current
>   getBlockFile()    = /data/4/dn/current/BP-1043055049-192.168.11.11-1382442676609/current/rbw/blk_-8536558734938003208
>   unlinked=false
>         at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:205)
>         at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:326)
>         at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:92)
>         at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:64)
>         at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:221)
>         at java.lang.Thread.run(Thread.java:744)
>
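
For context: the "Replica gen stamp < block genstamp" message is thrown from the
BlockSender constructor (BlockSender.java:205 in your trace) when a reader asks
for a block at a newer generation stamp (_3823240 here) than the replica the
DataNode actually holds (_3820986, still in state ReplicaWaitingToBeRecovered /
RWR). Each append or pipeline recovery bumps the block's generation stamp, so a
replica left behind at an older stamp cannot be served until recovery catches it
up. A minimal sketch of that guard, with stand-in types and paraphrased from the
log message rather than copied from the Hadoop source:

    import java.io.IOException;

    class GenStampGuard {
        // Hedged sketch of BlockSender's constructor check (simplified; the
        // long arguments stand in for the real Replica/ExtendedBlock types).
        static void checkGenerationStamp(long replicaGenStamp, long blockGenStamp,
                                         String block, String replica)
                throws IOException {
            // In your log: replica stamp 3820986 < requested stamp 3823240,
            // so the read is rejected until block recovery completes.
            if (replicaGenStamp < blockGenStamp) {
                throw new IOException("Replica gen stamp < block genstamp, block="
                        + block + ", replica=" + replica);
            }
        }
    }

Reads like this one normally start succeeding once lease/block recovery finishes
and the replica's generation stamp catches up, which is part of why the exact
release matters: append and recovery behavior changed considerably across Hadoop
versions.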
