hadoop-mapreduce-user mailing list archives

From Brahma Reddy Battula <brahmareddy.batt...@huawei.com>
Subject RE: Failed to start datanode due to bind exception
Date Thu, 12 Feb 2015 12:31:43 GMT
Hello Rajesh

I think you might have configured "dfs.domain.socket.path" as /var/run/hdfs-sockets/datanode.

This property is the path to a UNIX domain socket used for communication between
the DataNode and local HDFS clients. If the string "_PORT" is present in this path, it will
be replaced by the TCP port of the DataNode.
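For reference, the property lives in hdfs-site.xml; the snippet below is only an illustrative example of the "_PORT" substitution described above, not the original poster's actual configuration:

```xml
<!-- hdfs-site.xml (illustrative value only): with _PORT in the path,
     each DataNode substitutes its own TCP port, so multiple DataNodes
     on one host do not collide on the same socket file. -->
<property>
  <name>dfs.domain.socket.path</name>
  <value>/var/run/hdfs-sockets/dn._PORT</value>
</property>
```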

Normally you would get this error only if something is already bound to that socket path, so please re-check.

If the socket file is left over or corrupted (worst case), delete "/var/run/hdfs-sockets/datanode" and then start
the datanode.
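A minimal sketch of that cleanup, assuming the socket path from this thread (/var/run/hdfs-sockets/datanode); only remove the file if no live DataNode still holds it:

```shell
#!/bin/sh
# Sketch of the suggested cleanup. The default path here is the one
# assumed in this thread (dfs.domain.socket.path=/var/run/hdfs-sockets/datanode).
SOCK="${1:-/var/run/hdfs-sockets/datanode}"

if [ -S "$SOCK" ]; then
  # A leftover socket file from a crashed DataNode can trigger this
  # BindException; remove it, then start the DataNode again.
  echo "removing stale socket $SOCK"
  rm -f "$SOCK"
else
  echo "no socket file at $SOCK"
fi
# Then restart the DataNode, e.g. (script name varies by version/distro):
#   hadoop-daemon.sh start datanode
```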

Thanks & Regards

 Brahma Reddy Battula

From: Rajesh Thallam [rajesh.thallam@gmail.com]
Sent: Wednesday, February 11, 2015 12:09 AM
To: user@hadoop.apache.org
Subject: Re: Failed to start datanode due to bind exception

There are no contents in the hdfs-sockets directory.
The Apache Hadoop base version is 2.5.0 (using CDH 5.3.0).

On Tue, Feb 10, 2015 at 10:24 AM, Ted Yu <yuzhihong@gmail.com> wrote:
The exception came from DomainSocket so using netstat wouldn't reveal the conflict.

What's the output from:
ls -l /var/run/hdfs-sockets/datanode
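To illustrate why netstat missed the conflict: `netstat -t` lists only TCP sockets, while this bind error is on a UNIX domain socket. A sketch of what to run instead, assuming a Linux box with `ss` available:

```shell
# The TCP check that was tried finds nothing, because the conflict is not TCP.
# UNIX domain sockets show up under ss -x (or netstat -x) instead:
ss -xl 2>/dev/null | grep hdfs-sockets || true
# Inspect the socket path itself; a leading 's' in the mode column
# marks a socket file:
ls -l /var/run/hdfs-sockets/ 2>/dev/null || true
```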

Which hadoop release are you using ?


On Tue, Feb 10, 2015 at 10:12 AM, Rajesh Thallam <rajesh.thallam@gmail.com> wrote:

I have been repeatedly trying to start the datanode, but it fails with a bind exception saying the address
is already in use, even though the port is free.

I used the command below to check:

netstat -a -t --numeric-ports -p | grep 500

I have overridden the default port 50070 with 50081, but the issue still persists.

Starting DataNode with maxLockedMemory = 0
Opened streaming server at /<>
Balancing bandwith is 10485760 bytes/s
Number threads for balancing is 5
Waiting for threadgroup to exit, active threads is 0
Shutdown complete.
Exception in secureMain
java.net.BindException: bind(2) error: Address already in use when trying to bind to '/var/run/hdfs-sockets/datanode'
    at org.apache.hadoop.net.unix.DomainSocket.bind0(Native Method)
    at org.apache.hadoop.net.unix.DomainSocket.bindAndListen(DomainSocket.java:191)
    at org.apache.hadoop.hdfs.net.DomainPeerServer.<init>(DomainPeerServer.java:40)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getDomainPeerServer(DataNode.java:907)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:873)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1066)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:411)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2297)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2184)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2231)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2407)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2431)
Exiting with status 1




