hadoop-common-user mailing list archives

From Allen Wittenauer <awittena...@linkedin.com>
Subject Re: DisallowedDatanodeException
Date Wed, 08 Sep 2010 21:39:09 GMT

On Sep 8, 2010, at 10:00 AM, Harsh J wrote:

> Hosts file or the slaves file? A valid datanode must be in the slaves
> file. Alternatively you can see if they are 'triggered' to start by
> start-dfs.sh or not.

No, it doesn't.

The slaves file is only used by the start commands.

The hosts file (the include file pointed to by dfs.hosts) is the proper place for it.
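For reference, a minimal sketch of how the include file is wired up on the namenode. The dfs.hosts property is standard HDFS configuration, but the file path here is an assumption for illustration:

```xml
<!-- hdfs-site.xml (sketch): point the namenode at an include file.
     The path below is an assumed example, not from this thread. -->
<property>
  <name>dfs.hosts</name>
  <value>/etc/hadoop/conf/dfs.include</value>
</property>
```

Only datanodes whose names appear in that file are allowed to register; a node missing from it gets the DisallowedDatanodeException above.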

Chances are good we have a DNS issue:

>>>> ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>> org.apache.hadoop.ipc.RemoteException:
>>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: dev01:50010

Note that this is unqualified.  Yet:

>>>>    <value>hdfs://dev05.mynet.corp:54310</value>

This is qualified.
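In other words, an entry in the include file must match exactly the name the datanode registers with. A sketch of a hypothetical include file using the fully qualified form (hostnames follow the pattern in this thread; the file contents are an illustration, not the poster's actual config):

```
dev01.mynet.corp
dev02.mynet.corp
```

If the datanode identifies itself as the short name dev01 while the file lists dev01.mynet.corp, the names don't match and the namenode denies the registration.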

What form does your dfs.include file take, and what is the output of the hostname command?
