hadoop-hdfs-user mailing list archives

From TS chia <the.ts.c...@gmail.com>
Subject bin/hadoop fs -copyFromLocal fails when 1 datanode is down
Date Thu, 05 Jan 2012 16:23:39 GMT
Hi All,

I am new to Hadoop. I was able to get 3 datanodes running and working.
I deliberately shut down one datanode and executed
"bin/hadoop fs -copyFromLocal ../hadoop.sh /user/coka/somedir/slave02-datanodeDown"
to see what would happen.
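
For completeness, here is roughly the full sequence I ran (a sketch, assuming
the stock Hadoop daemon scripts; slave02 is the datanode I shut down, hence
the destination file name):

  # On slave02: stop the datanode daemon
  bin/hadoop-daemon.sh stop datanode

  # Back on the client: attempt the upload while that node is down
  bin/hadoop fs -copyFromLocal ../hadoop.sh /user/coka/somedir/slave02-datanodeDown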

The execution fails with the exception below.

Why is that so?

Thanks in advance.

Cheers
TS


12/01/05 15:41:40 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.160.1.103:50010
12/01/05 15:41:40 INFO hdfs.DFSClient: Abandoning block blk_-4266327738363863152_1008
12/01/05 15:41:40 INFO hdfs.DFSClient: Waiting to find target node: 192.160.1.4:50010
12/01/05 15:41:49 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.160.1.103:50010
12/01/05 15:41:49 INFO hdfs.DFSClient: Abandoning block blk_1476878908693006667_1008
12/01/05 15:42:13 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.160.1.103:50010
12/01/05 15:42:13 INFO hdfs.DFSClient: Abandoning block blk_-7529382935054091141_1008
12/01/05 15:42:13 INFO hdfs.DFSClient: Waiting to find target node: 192.160.1.4:50010
12/01/05 15:42:22 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.160.1.103:50010
12/01/05 15:42:22 INFO hdfs.DFSClient: Abandoning block blk_375095740848321980_1008
12/01/05 15:42:28 WARN hdfs.DFSClient: DataStreamer Exception: java.io.IOException: Unable to create new block.
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3002)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2255)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2446)

12/01/05 15:42:28 WARN hdfs.DFSClient: Error Recovery for block blk_375095740848321980_1008 bad datanode[1] nodes == null
12/01/05 15:42:28 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/coka/somedir/slave02-datanodeDown" - Aborting...
copyFromLocal: Bad connect ack with firstBadLink as 192.160.1.103:50010
12/01/05 15:42:28 ERROR hdfs.DFSClient: Exception closing file /user/coka/somedir/slave02-datanodeDown : java.io.IOException: Bad connect ack with firstBadLink as 192.160.1.103:50010
java.io.IOException: Bad connect ack with firstBadLink as 192.160.1.103:50010
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:3068)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2983)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2255)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2446)
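
In case it helps, this is how I checked the cluster state afterwards (a
sketch; bin/hadoop dfsadmin -report is the standard way to list datanodes,
and my assumption is that the namenode still reports the stopped node until
its heartbeat times out):

  # Ask the namenode which datanodes it currently knows about
  bin/hadoop dfsadmin -report

  # Look for the node from the log above (192.160.1.103)
  bin/hadoop dfsadmin -report | grep -A 2 "192.160.1.103"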
