hadoop-common-user mailing list archives

From Ravi Phulari <rphul...@yahoo-inc.com>
Subject Re: Cluster Setup Issues : Datanode not being initialized.
Date Thu, 04 Jun 2009 20:17:59 GMT
From the logs it looks like your Hadoop cluster is facing two different issues.

At the slave:

 1.  java.net.NoRouteToHostException: No route to host in your logs

Diagnosis - the slave cannot reach the master over the network. Make sure you can ssh between
your master and slave, that passwordless ssh keys are set up, and that nothing (e.g. a firewall)
is blocking the namenode port (54310 in your logs).
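A quick way to check this is a non-interactive ssh from the master. A minimal sketch, assuming "master" and "slave" are the hostnames listed in your conf/masters and conf/slaves files (adjust to your cluster):

```shell
#!/bin/sh
# For each cluster host, attempt a key-based (non-interactive) login.
# BatchMode=yes makes ssh exit non-zero instead of prompting for a
# password, so a failure here means passwordless login is not set up
# (or the host is unreachable).
for host in master slave; do
    if ssh -o BatchMode=yes -o ConnectTimeout=5 "$host" true 2>/dev/null; then
        echo "$host: passwordless ssh OK"
    else
        echo "$host: cannot log in without a password (or host unreachable)"
    fi
done
```

If a host fails here, fix ssh first (ssh-keygen / authorized_keys) before touching Hadoop itself.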

At the master:
   2. java.io.IOException: Incompatible namespaceIDs
Diagnosis - the namespaceID stored by the datanode no longer matches the namenode's, which
typically happens when the namenode is reformatted after the datanode has already registered.
Unfortunately the easiest fix is to reformat HDFS (which erases all data in it).
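If you want to confirm the mismatch before reformatting, each daemon records its namespaceID in a VERSION file under its storage directory. A minimal sketch, assuming the default layout under /tmp; the exact paths depend on your hadoop.tmp.dir / dfs.data.dir settings, so the ones below are only examples:

```shell
#!/bin/sh
# Compare the namespaceID the namenode and datanode have stored on disk.
# Example paths for the default /tmp layout; point these at your actual
# dfs.name.dir (on the master) and dfs.data.dir (on the datanode).
NAME_VERSION=/tmp/hadoop-user/dfs/name/current/VERSION
DATA_VERSION=/tmp/hadoop-user/dfs/data/current/VERSION

# The VERSION file is key=value lines; pull out the namespaceID value.
nn_id=$(grep '^namespaceID=' "$NAME_VERSION" | cut -d= -f2)
dn_id=$(grep '^namespaceID=' "$DATA_VERSION" | cut -d= -f2)

if [ "$nn_id" = "$dn_id" ]; then
    echo "namespaceIDs match: $nn_id"
else
    echo "MISMATCH: namenode=$nn_id datanode=$dn_id"
fi
```

If they differ and the cluster holds no data you care about, stop the daemons, delete the datanode's data directory, and run bin/hadoop namenode -format (again: this destroys everything in HDFS).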

Since you have not configured hadoop.tmp.dir or dfs.data.dir, Hadoop will by default use /tmp
for temporary files, log files, and the data directory, which is not good practice: /tmp is
often cleared on reboot, taking your HDFS metadata and blocks with it.
I would suggest using a dedicated directory somewhere other than /tmp.
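For example, in 0.18.x something like the following could go in conf/hadoop-site.xml on every node. The /app/hadoop/tmp path is just an illustration: create whatever directory you choose and chown it to the user running the Hadoop daemons before reformatting.

```xml
<!-- Keep Hadoop's working files out of /tmp, which the OS may wipe on
     reboot. The path below is an example; pick any durable location
     owned by the user that runs the Hadoop daemons. -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/app/hadoop/tmp</value>
</property>
```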

-
Ravi

On 6/4/09 12:39 PM, "asif md" <asif.d2d3@gmail.com> wrote:

Hello all,

I'm trying to set up a two-node cluster (remote) using the following
tutorials.
{ NOTE : I'm ignoring the tmp directory property in hadoop-site.xml
suggested by Michael }

Running Hadoop On Ubuntu Linux (Single-Node Cluster) - Michael G.
Noll<http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_%28Single-Node_Cluster%29>
Running Hadoop On Ubuntu Linux (Multi-Node Cluster) - Michael G.
Noll<http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_%28Multi-Node_Cluster%29>

I get the following logs when I try to run $HADOOP_HOME/bin/start-dfs.sh at
the master.

***************************************************************************************************
AT MASTER :
-----------------
2009-06-04 12:16:30,864 INFO org.apache.hadoop.dfs.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = *******
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.18.3
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.18 -r 736250;
compiled by 'ndaley' on Thu Jan 22 23:12:08 UTC 2009
************************************************************/
2009-06-04 12:16:31,071 ERROR org.apache.hadoop.dfs.DataNode:
java.io.IOException: Incompatible namespaceIDs in /tmp/*****/dfs/data:
namenode namespaceID = 34351921; datanode namespaceID = 539590337
        at
org.apache.hadoop.dfs.DataStorage.doTransition(DataStorage.java:226)
        at
org.apache.hadoop.dfs.DataStorage.recoverTransitionRead(DataStorage.java:141)
        at org.apache.hadoop.dfs.DataNode.startDataNode(DataNode.java:306)
        at org.apache.hadoop.dfs.DataNode.<init>(DataNode.java:223)
        at org.apache.hadoop.dfs.DataNode.makeInstance(DataNode.java:3071)
        at
org.apache.hadoop.dfs.DataNode.instantiateDataNode(DataNode.java:3026)
        at org.apache.hadoop.dfs.DataNode.createDataNode(DataNode.java:3034)
        at org.apache.hadoop.dfs.DataNode.main(DataNode.java:3156)

2009-06-04 12:16:31,071 INFO org.apache.hadoop.dfs.DataNode: SHUTDOWN_MSG:
/************************************************************
*******************************************************************************************************
AT SLAVE :
----------------
2009-06-04 12:16:28,203 INFO org.apache.hadoop.dfs.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = ****************
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.18.3
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.18 -r 736250;
compiled by 'ndaley' on Thu Jan 22 23:12:08 UTC 2009
************************************************************/
2009-06-04 12:16:32,175 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: master/198.55.35.229:54310. Already tried 0 time(s).
2009-06-04 12:16:33,178 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: master/198.55.35.229:54310. Already tried 1 time(s).
2009-06-04 12:16:34,181 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: master/198.55.35.229:54310. Already tried 2 time(s).
2009-06-04 12:16:35,184 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: master/198.55.35.229:54310. Already tried 3 time(s).
2009-06-04 12:16:36,187 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: master/198.55.35.229:54310. Already tried 4 time(s).
2009-06-04 12:16:37,190 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: master/198.55.35.229:54310. Already tried 5 time(s).
2009-06-04 12:16:38,193 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: master/198.55.35.229:54310. Already tried 6 time(s).
2009-06-04 12:16:39,196 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: master/198.55.35.229:54310. Already tried 7 time(s).
2009-06-04 12:16:40,198 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: master/198.55.35.229:54310. Already tried 8 time(s).
2009-06-04 12:16:41,200 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: master/198.55.35.229:54310. Already tried 9 time(s).
2009-06-04 12:16:41,222 ERROR org.apache.hadoop.dfs.DataNode:
java.io.IOException: Call to master/198.55.35.229:54310 failed on local
exception: java.net.NoRouteToHostException: No route to host
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:751)
    at org.apache.hadoop.ipc.Client.call(Client.java:719)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
    at org.apache.hadoop.dfs.$Proxy4.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:348)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:335)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:372)
    at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:309)
    at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:286)
    at org.apache.hadoop.dfs.DataNode.startDataNode(DataNode.java:277)
    at org.apache.hadoop.dfs.DataNode.<init>(DataNode.java:223)
    at org.apache.hadoop.dfs.DataNode.makeInstance(DataNode.java:3071)
    at
org.apache.hadoop.dfs.DataNode.instantiateDataNode(DataNode.java:3026)
    at org.apache.hadoop.dfs.DataNode.createDataNode(DataNode.java:3034)
    at org.apache.hadoop.dfs.DataNode.main(DataNode.java:3156)
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
    at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:100)
    at
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:301)
    at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:178)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:820)
    at org.apache.hadoop.ipc.Client.call(Client.java:705)
    ... 13 more

2009-06-04 12:16:41,222 INFO org.apache.hadoop.dfs.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at
opencirrus-1262.hpl.hp.com/198.55.36.243
************************************************************/



PLEASE COMMENT.

Thanks.

Asif.



