hadoop-mapreduce-user mailing list archives

From Satyam Singh <satyam.si...@ericsson.com>
Subject Re: Datanode not allowed to connect to the Namenode in Hadoop 2.3.0 cluster.
Date Tue, 05 Aug 2014 05:20:20 GMT
You have not added the namenode URI's hostname to the /etc/hosts file, so it
cannot be resolved to an IP address, and your namenode has probably not
started either. The preferred practice is to start the cluster with the
start-dfs.sh command, which implicitly starts the namenode first and then
all of its datanodes.

Also make sure you have given IP addresses in the slaves file; if you use
hostnames there instead, then also add entries for those hostnames to the
/etc/hosts file.
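For example, a minimal /etc/hosts for every node might look like the
following (a sketch: all addresses and all hostnames except
server1.mydomain.com, which appears in the quoted core-site.xml below, are
placeholders):

```
# Placeholder addresses/hostnames -- substitute your own cluster's values.
192.168.1.10   server1.mydomain.com   server1   # namenode (master)
192.168.1.11   server2.mydomain.com   server2   # datanode
192.168.1.12   server3.mydomain.com   server3   # datanode
192.168.1.13   server4.mydomain.com   server4   # datanode
```

The localhost lines can stay; the point is that every cluster hostname must
resolve on every node.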


BR,
Satyam

On 08/05/2014 12:21 AM, S.L wrote:
>
> The contents are
>
> 127.0.0.1   localhost localhost.localdomain localhost4 
> localhost4.localdomain4
> ::1         localhost localhost.localdomain localhost6 
> localhost6.localdomain6
>
>
>
> On Sun, Aug 3, 2014 at 11:21 PM, Ritesh Kumar Singh 
> <riteshoneinamillion@gmail.com> wrote:
>
>     Check the contents of the '/etc/hosts' file.
>
>
>     On Mon, Aug 4, 2014 at 3:27 AM, S.L <simpleliving016@gmail.com> wrote:
>
>         Hi All,
>
>         I am trying to set up an Apache Hadoop 2.3.0 cluster. I have a
>         master and three slave nodes; the slave nodes are listed in
>         the $HADOOP_HOME/etc/hadoop/slaves file, and I can telnet from
>         the slaves to the master namenode on port 9000. However, when
>         I start the datanode on any of the slaves I get the following
>         exception:
>
>         2014-08-03 08:04:27,952 FATAL
>         org.apache.hadoop.hdfs.server.datanode.DataNode:
>         Initialization failed for block pool Block pool
>         BP-1086620743-170.75.152.162-1407064313305 (Datanode Uuid
>         null) service to server1.dealyaft.com/170.75.152.162:9000
>         org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>         Datanode denied communication with namenode because hostname
>         cannot be resolved.
>
>         The following are the contents of my core-site.xml.
>
>         <configuration>
>             <property>
>                 <name>fs.default.name</name>
>                 <value>hdfs://server1.mydomain.com:9000</value>
>             </property>
>         </configuration>
>
>         Also, in my hdfs-site.xml I am not setting any values for the
>         dfs.hosts or dfs.hosts.exclude properties.
>
>         Thanks.
>
>
>
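The failure mode described above can be sketched in a few lines: with only
the localhost entries from the /etc/hosts quoted in this thread, a lookup
for a cluster hostname finds nothing, which is what surfaces as
DisallowedDatanodeException ("hostname cannot be resolved"). This is an
illustrative model of an /etc/hosts lookup, not Hadoop's actual resolver
code; server1.mydomain.com is taken from the quoted core-site.xml.

```python
def resolve(hostname, hosts_table):
    """Mimic an /etc/hosts lookup: return the IP mapped to hostname, or None."""
    for ip, names in hosts_table:
        if hostname in names:
            return ip
    return None

# The /etc/hosts contents quoted in the thread: localhost entries only.
hosts = [
    ("127.0.0.1", ["localhost", "localhost.localdomain", "localhost4"]),
    ("::1", ["localhost", "localhost.localdomain", "localhost6"]),
]

print(resolve("localhost", hosts))             # 127.0.0.1
print(resolve("server1.mydomain.com", hosts))  # None -> "hostname cannot be resolved"
```

Adding a line mapping each cluster hostname to its IP (on every node) makes
the second lookup succeed, which is the fix suggested at the top of this
message.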

