hadoop-common-user mailing list archives

From Harsh J <qwertyman...@gmail.com>
Subject Re: I got errors from hdfs about DataStreamer Exceptions.
Date Fri, 18 Feb 2011 02:43:45 GMT
You may want to check your HDFS health via 'fsck'
(http://namenode/fsck or the `hadoop fsck` command). There may be a few
corrupt files or bad DataNodes.

It would also help to know exactly which version of Hadoop you're running.
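As a rough sketch of the fsck check suggested above (assuming the standard Hadoop CLI on a running cluster; paths here are the reporter's, flags are the stock fsck options):

```shell
# Check overall HDFS health; reports corrupt, missing, or under-replicated blocks.
hadoop fsck /

# Narrow the check to the affected job directory, with per-file block detail.
# -files lists each file, -blocks lists its blocks, -locations shows which
# DataNodes hold each replica (useful for spotting a bad DataNode).
hadoop fsck /user/test/51 -files -blocks -locations
```

A healthy filesystem ends the report with "The filesystem under path ... is HEALTHY"; anything flagged CORRUPT or MISSING points at the blocks behind errors like the BlockMissingException below.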

On Fri, Feb 18, 2011 at 7:59 AM, Jun Young Kim <juneng603@gmail.com> wrote:
> hi, all.
>
> I got errors from hdfs.
>
> 2011-02-18 11:21:29[WARN ][DFSOutputStream.java]run()(519) : DataStreamer
> Exception: java.io.IOException: Unable to create new block.
>        at
> org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:832)
>        at
> org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:427)
>
> 2011-02-18 11:21:29[WARN
> ][DFSOutputStream.java]setupPipelineForAppendOrRecovery()(730) : Could not
> get block locations. Source file
> "/user/test/51/output/ehshop00newsvc-r-00000" - Aborting...
> 2011-02-18 11:21:29[WARN ][Child.java]main()(234) : Exception running child
> : java.io.EOFException
>        at java.io.DataInputStream.readShort(DataInputStream.java:298)
>        at
> org.apache.hadoop.hdfs.protocol.DataTransferProtocol$Status.read(DataTransferProtocol.java:113)
>        at
> org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:881)
>        at
> org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:820)
>        at
> org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:427)
>
> 2011-02-18 11:21:29[INFO ][Task.java]taskCleanup()(996) : Runnning cleanup
> for the task
>
>
>
> I think this next one is the same underlying error.
>
> org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block:
> blk_-2325764274016776017_8292 file=/user/test/51/input/kids.txt
>
>        at
> org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:559)
>
>        at
> org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:367)
>
>        at
> org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:514)
>
>        at java.io.DataInputStream.read(DataInputStream.java:83)
>
>        at org.apache.hadoop.util.LineReader.readLine(LineReader.java:138)
>
>        at
> org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:149)
>
>        at
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:465)
>
>        at
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
>
>        at
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:90)
>
>        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
>
>
> --> I've checked the file '/user/test/51/input/kids.txt', but there is
> nothing strange in it. The file looks healthy.
>
> Does anybody know about this error?
> How could I fix this one?
>
> thanks.
>
> --
> Junyoung Kim (juneng603@gmail.com)
>
>



-- 
Harsh J
www.harshj.com
