hadoop-mapreduce-user mailing list archives

From Alexander Alten-Lorenz <wget.n...@gmail.com>
Subject Re: Failed to start datanode due to bind exception
Date Thu, 12 Feb 2015 12:59:48 GMT
/var/run/hdfs-sockets has to have the right permissions. By default that is 755, owned by hdfs:hdfs.
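[Editorial note] The permission check can be sketched as follows. The `ls`/`chown`/`chmod` commands in the comments are what you would run on the DataNode host as root; the live lines below use a scratch directory as a stand-in so the sketch runs without root:

```shell
# On the DataNode host (as root), the real check and fix would be:
#   ls -ld /var/run/hdfs-sockets        # expect: drwxr-xr-x ... hdfs hdfs
#   chown hdfs:hdfs /var/run/hdfs-sockets && chmod 755 /var/run/hdfs-sockets
# Demonstration on a scratch directory standing in for /var/run/hdfs-sockets:
d=$(mktemp -d)
chmod 755 "$d"
stat -c '%a' "$d"   # prints 755
```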

BR,
 Alexander 

> On 10 Feb 2015, at 19:39, Rajesh Thallam <rajesh.thallam@gmail.com> wrote:
> 
> The hdfs-sockets directory is empty.
> The Apache Hadoop base version is 2.5.0 (using CDH 5.3.0).
> 
> On Tue, Feb 10, 2015 at 10:24 AM, Ted Yu <yuzhihong@gmail.com> wrote:
> The exception came from DomainSocket so using netstat wouldn't reveal the conflict.
> 
> What's the output from:
> ls -l /var/run/hdfs-sockets/datanode
> 
> Which hadoop release are you using ?
> 
> Cheers
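[Editorial note] Ted's point can be verified directly: `netstat -t` lists only TCP sockets, while listening Unix domain sockets show up under `ss -x` (or `lsof -U`). A quick check, assuming iproute2's `ss` is installed:

```shell
# Listening Unix domain sockets; a process holding the DataNode socket
# path would appear here, which netstat -t (TCP only) cannot show:
ss -xlp 2>/dev/null | grep hdfs-sockets || echo "no listener on an hdfs-sockets path"
```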
> 
> On Tue, Feb 10, 2015 at 10:12 AM, Rajesh Thallam <rajesh.thallam@gmail.com> wrote:
> I have been repeatedly trying to start the datanode, but it fails with a bind exception saying the address is already in use, even though the port is free.
> 
> I used below commands to check
> 
> netstat -a -t --numeric-ports -p | grep 500
> 
>  
> I have overridden the default port 50070 with 50081, but the issue persists.
> 
> Starting DataNode with maxLockedMemory = 0
> Opened streaming server at /172.19.7.160:50081
> Balancing bandwith is 10485760 bytes/s
> Number threads for balancing is 5
> Waiting for threadgroup to exit, active threads is 0
> Shutdown complete.
> Exception in secureMain
> java.net.BindException: bind(2) error: Address already in use when trying to bind to '/var/run/hdfs-sockets/datanode'
>     at org.apache.hadoop.net.unix.DomainSocket.bind0(Native Method)
>     at org.apache.hadoop.net.unix.DomainSocket.bindAndListen(DomainSocket.java:191)
>     at org.apache.hadoop.hdfs.net.DomainPeerServer.<init>(DomainPeerServer.java:40)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.getDomainPeerServer(DataNode.java:907)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:873)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1066)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:411)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2297)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2184)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2231)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2407)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2431)
> Exiting with status 1
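[Editorial note, not from the thread] For a Unix domain socket, `bind(2)` fails with EADDRINUSE whenever the path already exists on disk, even if no process is listening, so a socket file left behind by a crashed DataNode reproduces exactly this error. A sketch of the check (the path is the one from the exception; adjust it to match your cluster's `dfs.domain.socket.path`):

```shell
# Hypothetical check for a leftover socket file from a previous crash:
f=/var/run/hdfs-sockets/datanode
if [ -e "$f" ]; then
    echo "leftover file at $f"
    # After confirming no DataNode is running (jps | grep -w DataNode),
    # removing the stale file lets bind(2) succeed again:
    # rm -f "$f"
else
    echo "no leftover file at $f"
fi
```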
> 
> hdfs-site.xml
>   <property>
>     <name>dfs.datanode.address</name>
>     <value>hostname.dc.xx.org:50010</value>
>   </property>
>   <property>
>     <name>dfs.datanode.ipc.address</name>
>     <value>hostname.dc.xx.org:50020</value>
>   </property>
>   <property>
>     <name>dfs.datanode.http.address</name>
>     <value>hostname.dc.xx.org:50075</value>
>   </property>
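[Editorial note] The path in the exception does not come from any of the TCP addresses above; it is set by `dfs.domain.socket.path`, which CDH configures for short-circuit reads. A fragment matching the path in the stack trace would look like:

```xml
<!-- Controls the Unix domain socket the DataNode binds for
     short-circuit reads; this is a filesystem path, not a TCP port,
     so changing 50070 to 50081 does not affect it. -->
<property>
  <name>dfs.domain.socket.path</name>
  <value>/var/run/hdfs-sockets/datanode</value>
</property>
```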
> Regards,
> RT
> 
> 
> 
> 
> -- 
> Cheers,
> RT

