hadoop-common-user mailing list archives

From Jun Young Kim <juneng603@gmail.com>
Subject Re: I got errors from hdfs about DataStreamer Exceptions.
Date Fri, 18 Feb 2011 02:51:07 GMT
Hi, Harsh.
You always respond very quickly. ;)

I am using version 0.21.0 now.
Before asking about this problem, I had already checked the file system's health:

$> hadoop fsck /
.....
.....
Status: HEALTHY
  Total size:    24231595038 B
  Total dirs:    43818
  Total files:   41193 (Files currently being written: 2178)
  Total blocks (validated):      40941 (avg. block size 591866 B) (Total open file blocks (not validated): 224)
  Minimally replicated blocks:   40941 (100.0 %)
  Over-replicated blocks:        1 (0.0024425392 %)
  Under-replicated blocks:       2 (0.0048850784 %)
  Mis-replicated blocks:         0 (0.0 %)
  Default replication factor:    2
  Average block replication:     2.1106226
  Corrupt blocks:                0
  Missing replicas:              4 (0.00462904 %)
  Number of data-nodes:          8
  Number of racks:               1

The filesystem under path '/' is HEALTHY
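
By the way, fsck can also drill into a single file. If I read the 0.21 fsck options right, something like this should print every block of a file and the datanodes holding each replica:

$> hadoop fsck /user/test/51/input/kids.txt -files -blocks -locations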

Additionally, I found a slightly different error. Here it is:

java.io.IOException: Bad connect ack with firstBadLink as 10.25.241.107:50010
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:889)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:820)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:427)
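
Since firstBadLink names 10.25.241.107:50010, I will look at that datanode first. Roughly like this (the log path is only a guess for our install):

$> hadoop dfsadmin -report | grep -A 6 'Name: 10.25.241.107'
$> ssh 10.25.241.107 'ulimit -n'
$> ssh 10.25.241.107 'tail -n 100 /var/log/hadoop/*-datanode-*.log'

The report should tell whether the node is live and how full it is, and the ulimit and the datanode log should show whether it is running out of file descriptors or transfer threads.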


Here is my execution environment:

average job count : 20
max map capacity : 128
max reduce capacity : 128
avg slots per node : 32

avg input file size per job : 200M ~ 1G
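
With up to 128 maps and 128 reduces writing through only 8 datanodes, I suspect we may be hitting the datanode transceiver limit. As far as I know, the default for dfs.datanode.max.xcievers (the property really is spelled that way) is 256, which is often too low at this concurrency, and "Bad connect ack" pipeline errors are a typical symptom. The value below is only a guess on my side; it would go into hdfs-site.xml on every datanode, followed by a datanode restart:

<property>
  <!-- max simultaneous block send/receive threads per datanode -->
  <name>dfs.datanode.max.xcievers</name>
  <value>4096</value>
</property>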

thanks.

Junyoung Kim (juneng603@gmail.com)


On 02/18/2011 11:43 AM, Harsh J wrote:
> You may want to check your HDFS health status via 'fsck'
> (http://namenode/fsck or `hadoop fsck`). There may be a few corrupt
> files or bad DNs.
>
> Would also be good to know what exact version of Hadoop you're running.
>
> On Fri, Feb 18, 2011 at 7:59 AM, Jun Young Kim<juneng603@gmail.com>  wrote:
>> Hi, all.
>>
>> I got errors from HDFS.
>>
>> 2011-02-18 11:21:29[WARN ][DFSOutputStream.java]run()(519) : DataStreamer Exception: java.io.IOException: Unable to create new block.
>>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:832)
>>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:427)
>>
>> 2011-02-18 11:21:29[WARN ][DFSOutputStream.java]setupPipelineForAppendOrRecovery()(730) : Could not get block locations. Source file "/user/test/51/output/ehshop00newsvc-r-00000" - Aborting...
>> 2011-02-18 11:21:29[WARN ][Child.java]main()(234) : Exception running child : java.io.EOFException
>>         at java.io.DataInputStream.readShort(DataInputStream.java:298)
>>         at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$Status.read(DataTransferProtocol.java:113)
>>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:881)
>>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:820)
>>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:427)
>>
>> 2011-02-18 11:21:29[INFO ][Task.java]taskCleanup()(996) : Runnning cleanup for the task
>>
>>
>>
>> I think this one is not a different error, either.
>>
>> org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: blk_-2325764274016776017_8292 file=/user/test/51/input/kids.txt
>>         at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:559)
>>         at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:367)
>>         at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:514)
>>         at java.io.DataInputStream.read(DataInputStream.java:83)
>>         at org.apache.hadoop.util.LineReader.readLine(LineReader.java:138)
>>         at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:149)
>>         at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:465)
>>         at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
>>         at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:90)
>>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
>>
>>
>> --> I've checked the file '/user/test/51/input/kids.txt', but there is nothing strange about it; the file is healthy.
>>
>> Does anybody know about this error?
>> How could I fix this one?
>>
>> thanks.
>>
>> --
>> Junyoung Kim (juneng603@gmail.com)
>>
>>
>
>
