hadoop-common-user mailing list archives

From Jason Venner <jason.had...@gmail.com>
Subject Re: dfs fail to Unable to create new block
Date Tue, 28 Jul 2009 12:30:23 GMT
This looks like a communication failure with your DataNode: possibly it has
run out of file descriptors, or there is a networking issue. What version of
Hadoop are you running?
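If file descriptors are the suspect, the first thing to check on the node is
the per-process open-file limit; the stock Linux default of 1024 is far too
low for a busy DataNode. A quick, Hadoop-agnostic sanity check (raising the
limit permanently would normally go through /etc/security/limits.conf):

```shell
# Show the soft limit on open file descriptors for the current shell.
# The DataNode inherits whichever limit is in effect in the shell or
# init script that launches it, so run this as the same user.
ulimit -n
```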

> 2009-07-28 18:01:30,622 WARN org.apache.hadoop.hdfs.DFSClient: Could not
> get block locations. Source file "/data/segment/dat_4_8" - Aborting...
> 2009-07-28 18:01:30,635 WARN org.apache.hadoop.mapred.TaskTracker: Error
> running child



On Tue, Jul 28, 2009 at 3:16 AM, Jianmin Woo <jianmin_woo@yahoo.com> wrote:

> Hi,
>
> I am encountering a problem when running a Hadoop job with a relatively
> large dataset (about 400M) in a single-node Hadoop environment.
> The error says that DFS fails to create a new block, yet the physical disk
> has plenty of free space. Is there another possible reason for this
> failure? Is there any limit on the amount of disk space a single job can
> occupy? The following is a snippet of the exception stack. Thanks for your
> attention.
>
> Regards,
> Jianmin
>
>
> 2009-07-28 18:00:31,757 INFO org.apache.hadoop.mapred.Merger: Merging 1
> sorted segments
> 2009-07-28 18:00:31,792 INFO org.apache.hadoop.mapred.Merger: Down to the
> last merge-pass, with 1 segments left of total size: 1272809137 bytes
> 2009-07-28 18:01:06,521 INFO org.apache.hadoop.hdfs.DFSClient: Exception in
> createBlockOutputStream java.io.EOFException
> 2009-07-28 18:01:06,521 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning
> block blk_2149418359249628613_12378
> 2009-07-28 18:01:12,578 INFO org.apache.hadoop.hdfs.DFSClient: Exception in
> createBlockOutputStream java.io.EOFException
> 2009-07-28 18:01:12,578 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning
> block blk_-4276450909968435375_12378
> 2009-07-28 18:01:18,581 INFO org.apache.hadoop.hdfs.DFSClient: Exception in
> createBlockOutputStream java.io.EOFException
> 2009-07-28 18:01:18,581 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning
> block blk_1370666846409896923_12378
> 2009-07-28 18:01:24,584 INFO org.apache.hadoop.hdfs.DFSClient: Exception in
> createBlockOutputStream java.io.EOFException
> 2009-07-28 18:01:24,584 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning
> block blk_-8117322104093252360_12378
> 2009-07-28 18:01:30,621 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer
> Exception: java.io.IOException: Unable to create new block.
>        at
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2781)
>        at
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2046)
>        at
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2232)
>
> 2009-07-28 18:01:30,622 WARN org.apache.hadoop.hdfs.DFSClient: Error
> Recovery for block blk_-8117322104093252360_12378 bad datanode[0] nodes ==
> null
> 2009-07-28 18:01:30,622 WARN org.apache.hadoop.hdfs.DFSClient: Could not
> get block locations. Source file "/data/segment/dat_4_8" - Aborting...
> 2009-07-28 18:01:30,635 WARN org.apache.hadoop.mapred.TaskTracker: Error
> running child
> java.io.EOFException
>        at java.io.DataInputStream.readByte(DataInputStream.java:250)
>        at
> org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
>        at
> org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
>        at org.apache.hadoop.io.Text.readString(Text.java:400)
>        at
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2837)
>        at
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2762)
>        at
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2046)
>        at
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2232)
>
>
> 2009-07-28 18:01:30,645 INFO org.apache.hadoop.mapred.TaskRunner: Runnning
> cleanup for the task
>
>
>
>
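The repeated EOFException in createBlockOutputStream above means the client's
write connection to the DataNode was closed before a full response arrived,
which would be consistent with the DataNode running out of descriptors. On
Linux you can count how many a process currently holds (demonstrated against
the current shell, $$, so the snippet is self-contained; on a real node
substitute the DataNode's PID from `jps`):

```shell
# List the file descriptors a process holds and count them; compare the
# result against the limit reported by `ulimit -n` for that process.
ls /proc/$$/fd | wc -l
```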




-- 
Pro Hadoop, a book to guide you from beginner to hadoop mastery,
http://www.amazon.com/dp/1430219424?tag=jewlerymall
www.prohadoopbook.com a community for Hadoop Professionals
