hadoop-general mailing list archives

From xiao yang <yangxiao9...@gmail.com>
Subject Error when putting too large folder into Hadoop
Date Fri, 20 Nov 2009 06:20:14 GMT
Hi all,

I'm using hadoop-0.19.1, which is bundled with Nutch 1.0. HDFS is deployed
on 12 nodes, and they all work fine.
Now I want to put a folder of about 100GB into HDFS. Here is the command:
bin/hadoop fs -put /srcfolder /pub
When the source folder is smaller than 10GB, everything is OK, but when the
folder is too large (I haven't figured out exactly how large yet), the copy
fails partway through.
What's the problem, and how can I fix it?
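
For reference, here is a rough sketch of how I could copy the data one
subdirectory at a time to narrow down the size at which it starts failing.
It assumes /srcfolder is split into subdirectories that are each well under
10GB; the paths are the same as in the command above.

bin/hadoop fs -mkdir /pub/srcfolder
for d in /srcfolder/*/ ; do
    echo "putting $d"
    bin/hadoop fs -put "$d" /pub/srcfolder/
done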

Here are the error messages printed on the screen:

2009-11-20 11:31:09,170 INFO  DFSClient - Exception in
createBlockOutputStream java.io.IOException: Bad connect ack with
firstBadLink 10.214.10.140:50010
2009-11-20 11:31:09,282 INFO  DFSClient - Abandoning block
blk_4524254776889347494_31880
2009-11-20 11:31:15,298 INFO  DFSClient - Exception in
createBlockOutputStream java.io.IOException: Bad connect ack with
firstBadLink 10.214.10.141:50010
2009-11-20 11:31:15,298 INFO  DFSClient - Abandoning block
blk_-6096773602253771497_31884
2009-11-20 11:31:22,105 INFO  DFSClient - Exception in
createBlockOutputStream java.io.IOException: Bad connect ack with
firstBadLink 10.214.10.141:50010
2009-11-20 11:31:22,106 INFO  DFSClient - Abandoning block
blk_7476142191552180978_31895
2009-11-20 11:31:28,109 INFO  DFSClient - Exception in
createBlockOutputStream java.io.IOException: Bad connect ack with
firstBadLink 10.214.10.145:50010
2009-11-20 11:31:28,109 INFO  DFSClient - Abandoning block
blk_1702775959905886525_31903
2009-11-20 11:31:34,114 INFO  DFSClient - Exception in
createBlockOutputStream java.io.IOException: Could not read from
stream
2009-11-20 11:31:34,114 INFO  DFSClient - Abandoning block
blk_-6529348247759206930_31906
2009-11-20 11:31:47,507 INFO  DFSClient - Exception in
createBlockOutputStream java.io.IOException: Bad connect ack with
firstBadLink 10.214.10.144:50010
2009-11-20 11:31:47,508 INFO  DFSClient - Abandoning block
blk_8773438940339692415_32051
2009-11-20 11:31:53,511 INFO  DFSClient - Exception in
createBlockOutputStream java.io.IOException: Bad connect ack with
firstBadLink 10.214.10.139:50010
2009-11-20 11:31:53,511 INFO  DFSClient - Abandoning block
blk_-2934984271889679562_32063
2009-11-20 11:31:59,515 INFO  DFSClient - Exception in
createBlockOutputStream java.io.IOException: Bad connect ack with
firstBadLink 10.214.10.145:50010
2009-11-20 11:31:59,515 INFO  DFSClient - Abandoning block
blk_-9114974684513095559_32065
2009-11-20 11:32:05,519 INFO  DFSClient - Exception in
createBlockOutputStream java.io.IOException: Bad connect ack with
firstBadLink 10.214.10.145:50010
2009-11-20 11:32:05,519 INFO  DFSClient - Abandoning block
blk_8604459663942116813_32071
2009-11-20 11:32:11,580 WARN  DFSClient - DataStreamer Exception:
java.io.IOException: Unable to create new block.
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2722)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:1996)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2183)

2009-11-20 11:32:11,580 WARN  DFSClient - Error Recovery for block
blk_8604459663942116813_32071 bad datanode[1] nodes == null
2009-11-20 11:32:11,580 WARN  DFSClient - Could not get block
locations. Source file "/pub/comic/_Incoming_/1.wma" - Aborting...
put: Bad connect ack with firstBadLink 10.214.10.145:50010
