hadoop-hdfs-user mailing list archives

From Shekhar Sharma <shekhar2...@gmail.com>
Subject Re: Datanode Shutting down automatically
Date Sat, 25 Jan 2014 07:09:56 GMT
This is the incompatible namespaceID error. It happens because you probably formatted the namenode, but the datanode's data folder still carries the old namespaceID.

What are the values of the following properties?
dfs.data.dir
dfs.name.dir
hadoop.tmp.dir

Each of these properties points to a directory on the local file system.
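For reference, a minimal sketch of how these properties are typically set in conf/core-site.xml and conf/hdfs-site.xml; the /app/hadoop/tmp value matches the path in your log, the rest is just an assumed layout:

    <!-- core-site.xml -->
    <property>
      <name>hadoop.tmp.dir</name>
      <value>/app/hadoop/tmp</value>
    </property>

    <!-- hdfs-site.xml (if unset, these default to ${hadoop.tmp.dir}/dfs/name and ${hadoop.tmp.dir}/dfs/data) -->
    <property>
      <name>dfs.name.dir</name>
      <value>/app/hadoop/tmp/dfs/name</value>
    </property>
    <property>
      <name>dfs.data.dir</name>
      <value>/app/hadoop/tmp/dfs/data</value>
    </property>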

The solution is to open the VERSION file of the namenode (under the dfs/name/current folder); you will see a line with the namespaceID. Copy that value, then open the VERSION file of the datanode that is not coming up (under the dfs/data/current folder) and replace its namespaceID with the one you copied. Now start the datanode process.
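Here is a minimal sketch of that edit in Python, assuming the layout under /app/hadoop/tmp shown in your log; adjust the two paths to wherever dfs.name.dir and dfs.data.dir actually point on your machines, and run it on the datanode while it is stopped:

    import re

    # Paths assume hadoop.tmp.dir = /app/hadoop/tmp (as in the log); adjust as needed.
    NAMENODE_VERSION = "/app/hadoop/tmp/dfs/name/current/VERSION"
    DATANODE_VERSION = "/app/hadoop/tmp/dfs/data/current/VERSION"

    # Read the namenode's namespaceID (a line of the form "namespaceID=102782159").
    with open(NAMENODE_VERSION) as f:
        namenode_id = re.search(r"^namespaceID=(\d+)$", f.read(), re.MULTILINE).group(1)

    # Rewrite the datanode's VERSION file so its namespaceID matches the namenode's.
    with open(DATANODE_VERSION) as f:
        contents = f.read()
    contents = re.sub(r"^namespaceID=\d+$", "namespaceID=" + namenode_id,
                      contents, flags=re.MULTILINE)
    with open(DATANODE_VERSION, "w") as f:
        f.write(contents)

    print("datanode namespaceID set to", namenode_id)

After that, start the datanode again (for example with bin/hadoop-daemon.sh start datanode) and check its log.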

A dirty hack is to delete those folders (the directories specified by the above properties) from all the machines, then format your namenode and start the processes again. Please note that the previous data will be lost.
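If you do go that route, a rough sketch of the cleanup plus reformat on the namenode host (run the same deletion on every datanode, minus the format); the paths again assume your log's layout and the hadoop script is assumed to be on the PATH:

    import shutil
    import subprocess

    # Directories named by dfs.name.dir and dfs.data.dir; adjust to your config.
    for d in ["/app/hadoop/tmp/dfs/name", "/app/hadoop/tmp/dfs/data"]:
        shutil.rmtree(d, ignore_errors=True)

    # Reformat the namenode; this wipes all HDFS metadata, so the previous data is gone.
    # The format command asks for confirmation, hence the piped "Y".
    subprocess.run(["hadoop", "namenode", "-format"], input=b"Y\n", check=True)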
On 25 Jan 2014 11:55, "Pranav Gadekar" <ppgadekar.92@gmail.com> wrote:

> This is my log file.
>
> 2014-01-24 17:24:58,238 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = user/127.0.1.1
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.2.1
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r
> 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013
> STARTUP_MSG:   java = 1.6.0_27
> ************************************************************/
> 2014-01-24 17:24:58,622 INFO
> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
> hadoop-metrics2.properties
> 2014-01-24 17:24:58,669 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
> MetricsSystem,sub=Stats registered.
> 2014-01-24 17:24:58,670 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2014-01-24 17:24:58,670 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
> started
> 2014-01-24 17:24:58,877 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
> registered.
> 2014-01-24 17:24:58,880 WARN
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
> exists!
> 2014-01-24 17:25:10,778 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
> Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
> = 102782159; datanode namespaceID = 1227483104
>     at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:414)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:321)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1712)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1651)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1669)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1795)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1812)
>
> 2014-01-24 17:25:10,779 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at user/127.0.1.1
> ************************************************************/
> 2014-01-24 17:26:13,413 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = user/127.0.1.1
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.2.1
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r
> 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013
> STARTUP_MSG:   java = 1.6.0_27
> ************************************************************/
> 2014-01-24 17:26:13,510 INFO
> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
> hadoop-metrics2.properties
> 2014-01-24 17:26:13,518 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
> MetricsSystem,sub=Stats registered.
> 2014-01-24 17:26:13,518 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2014-01-24 17:26:13,518 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
> started
> 2014-01-24 17:26:13,626 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
> registered.
> 2014-01-24 17:26:13,628 WARN
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
> exists!
> 2014-01-24 17:26:28,860 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
> Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
> = 102782159; datanode namespaceID = 1227483104
>     at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:414)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:321)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1712)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1651)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1669)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1795)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1812)
>
> 2014-01-24 17:26:28,860 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at user/127.0.1.1
> ************************************************************/
>
>
> The shutdown is initiated automatically.
> Please take a look and suggest a solution.
>
