hadoop-hdfs-user mailing list archives

From Harold Lim <rold...@yahoo.com>
Subject Re: Bad connect ack with firstBadLink
Date Tue, 17 Nov 2009 00:05:19 GMT
When I checked my datanode logs, I found:

2009-11-16 18:58:19,884 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(XXX.XXX.XXX.XXX:50010,
storageID=DS-1473795645-152.3.144.235-50010-1258414119856, infoPort=50075, ipcPort=50020):DataXceiver
java.io.IOException: xceiverCount 257 exceeds the limit of concurrent xcievers 256

Also, when I run netstat I see a lot of TCP connections in the CLOSE_WAIT state, and they stay there
until I restart HDFS.
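A quick way to quantify that backlog is to count the stuck sockets on the datanode's data-transfer port (a sketch; 50010 is the port shown in the log above, and netstat's column layout varies slightly between platforms):

```shell
# Count TCP connections stuck in CLOSE_WAIT on the datanode transfer port (50010).
# In netstat -tan output the last column is the TCP state and the fourth is the
# local address, so filter on both before counting.
netstat -tan 2>/dev/null | awk '$NF == "CLOSE_WAIT" && $4 ~ /:50010$/' | wc -l
```

If that count approaches the xceiver limit of 256 from the log, the two symptoms are likely the same underlying problem.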



-Harold
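(A note for later readers of this thread: the ceiling named in the datanode log above is set by the `dfs.datanode.max.xcievers` property in hdfs-site.xml; the misspelling is historical and must be reproduced exactly. A common workaround, assuming the connections are genuinely concurrent transfers rather than a leak, is to raise it on every datanode and restart them, for example:)

```xml
<!-- hdfs-site.xml on each datanode; note the historical misspelling "xcievers". -->
<!-- 4096 is a commonly suggested value, not a tuned recommendation. -->
<property>
  <name>dfs.datanode.max.xcievers</name>
  <value>4096</value>
</property>
```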


--- On Mon, 11/16/09, Jason Venner <jason.hadoop@gmail.com> wrote:

> From: Jason Venner <jason.hadoop@gmail.com>
> Subject: Re: Bad connect ack with firstBadLink
> To: hdfs-user@hadoop.apache.org
> Date: Monday, November 16, 2009, 10:39 AM
> The most common cause of that is something in
> your chain of applications running out of file descriptors, usually
> your application, but on rare occasions the datanodes themselves.
> 
> The first time I saw this with the hadoop dfs command,
> it happened on only one machine in our cluster, when the
> copyFromLocal included a wildcard; raising the per-user file
> descriptor limit resolved it.
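A quick way to check that theory on a running node (a sketch; it assumes the JDK's `jps` and the `lsof` utility are on the PATH, and that the datanode process appears as DataNode in the jps listing):

```shell
# Per-process file-descriptor limits for the current user (soft and hard).
ulimit -Sn
ulimit -Hn

# Open-file count for the DataNode process, if one is running locally.
DN_PID=$(jps 2>/dev/null | awk '/DataNode/ {print $1}')
if [ -n "$DN_PID" ]; then
  lsof -p "$DN_PID" | wc -l
fi
```

If the open-file count sits near the soft limit, raising the per-user limit (e.g. in /etc/security/limits.conf on Linux) addresses the cause, whereas restarting HDFS only clears the symptom.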
> 
> 
> On Mon, Nov 16, 2009 at 3:35 PM,
> Harold Lim <rold_50@yahoo.com>
> wrote:
> 
> Hi All,
> 
> I'm trying to copy files from my local filesystem to
> HDFS but after a while I'm getting a
> java.io.IOException. Any ideas why I'm getting this?
> 
> /mount/hadoop-core-0.22.0-dev/bin# ./hdfs dfs
> -copyFromLocal /mount/filestore /filestore
> 
> 
> 
> 09/11/15 00:36:59 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink as XXXXX:50010
> 09/11/15 00:36:59 INFO hdfs.DFSClient: Abandoning block blk_5186243442349315665_1490
> 09/11/15 00:37:05 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink as XXXXX:50010
> 09/11/15 00:37:05 INFO hdfs.DFSClient: Abandoning block blk_-8119590188070910888_1490
> 09/11/15 00:37:12 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink as XXXXX:50010
> 09/11/15 00:37:12 INFO hdfs.DFSClient: Abandoning block blk_8182324531276884825_1492
> 09/11/15 00:37:18 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink as XXXXX:50010
> 09/11/15 00:37:18 INFO hdfs.DFSClient: Abandoning block blk_6572699021740215303_1492
> 09/11/15 00:37:24 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.EOFException
> 09/11/15 00:37:24 INFO hdfs.DFSClient: Abandoning block blk_-6796814980855591976_1492
> 09/11/15 00:37:30 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink as XXXXX:50010
> 09/11/15 00:37:30 INFO hdfs.DFSClient: Abandoning block blk_-7019917111201756069_1494
> 09/11/15 00:37:36 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink as XXXXX:50010
> 09/11/15 00:37:36 INFO hdfs.DFSClient: Abandoning block blk_6878833767189775663_1494
> 09/11/15 00:37:42 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink as XXXXX:50010
> 09/11/15 00:37:42 INFO hdfs.DFSClient: Abandoning block blk_-272412162654992144_1494
> 09/11/15 00:37:48 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.EOFException
> 09/11/15 00:37:48 INFO hdfs.DFSClient: Abandoning block blk_1592504437802093135_1494
> 09/11/15 00:37:54 WARN hdfs.DFSClient: DataStreamer Exception: java.io.IOException: Unable to create new block.
>       at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSClient.java:3086)
>       at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2667)
> 
> 09/11/15 00:37:54 WARN hdfs.DFSClient: Could not get block locations. Source file "/filestore/e1565t.jpg" - Aborting...
> copyFromLocal: null
> 09/11/15 00:37:54 ERROR hdfs.DFSClient: Exception closing file /filestore/e1565t.jpg : java.io.EOFException
> java.io.EOFException
>       at java.io.DataInputStream.readShort(DataInputStream.java:298)
>       at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$Status.read(DataTransferProtocol.java:109)
>       at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.createBlockOutputStream(DFSClient.java:3138)
>       at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSClient.java:3066)
>       at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2667)
> 
> -Harold
> 
> -- 
> Pro Hadoop, a book to guide you from beginner to hadoop
> mastery,
> http://www.amazon.com/dp/1430219424?tag=jewlerymall
> 
> www.prohadoopbook.com
> a community for Hadoop Professionals
> 
> 


      
