hive-user mailing list archives

From Vikas Srivastava <vikas.srivast...@one97.net>
Subject Re: Decommission of datanode(Urgent)
Date Tue, 20 Sep 2011 06:03:50 GMT
Hey!!

I am using Apache Hadoop 0.20.2 and Hive 0.7.0.

One thing I would like to ask: do I have to create the exclude file on all
the datanodes, or only on the NAMENODE? If only on the NN, then I did that, but it is not working.
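
For reference, this is the exclude property from my first mail below (I had
added it to core-site.xml):

  <property>
    <name>dfs.hosts.exclude</name>
    <value>/home/hadoop/excludes</value>
    <final>true</final>
  </property>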

I did what you suggested, but it is not working. What actually happened is
that all the live nodes went to 0 (became dead). I checked the NameNode logs,
where I found these error messages:

2011-09-19 12:33:47,695 INFO org.apache.hadoop.ipc.Server: IPC Server
handler 24 on 9000, call sendHeartbeat(
DatanodeRegistration(10.0.3.16:50010,
storageID=DS-1703098060-10.0.3.16-50010-1298269611944, infoPort=50075,
ipcPort=50020), 2012206694400, 1650194042865, 271003275264, 0, 1) from
10.0.3.16:38587: error:
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode
denied communication with namenode: 10.0.3.16:50010
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode
denied communication with namenode: 10.0.3.16:50010
        at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.handleHeartbeat(FSNamesystem.java:2235)
        at
org.apache.hadoop.hdfs.server.namenode.NameNode.sendHeartbeat(NameNode.java:704)
        at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
2011-09-19 12:33:47,701 INFO org.apache.hadoop.ipc.Server: IPC Server
handler 7 on 9000, call sendHeartbeat(DatanodeRegistration(10.0.5.36:50010,
storageID=DS-809855347-10.0.5.36-50010-1316252293924, infoPort=50075,
ipcPort=50020), 1938687860736, 1390486994944, 457712619520, 0, 1) from
10.0.5.36:58924: error:
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode
denied communication with namenode: 10.0.5.36:50010
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode
denied communication with namenode: 10.0.5.36:50010
        at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.handleHeartbeat(FSNamesystem.java:2235)
        at
org.apache.hadoop.hdfs.server.namenode.NameNode.sendHeartbeat(NameNode.java:704)
        at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)


On Mon, Sep 19, 2011 at 7:01 PM, Siddharth Tiwari <siddharth.tiwari@live.com> wrote:

> Hi Vikas,
>
> Oh, I see; then it should work as per your configuration. What you can try
> is: first stop the services, then edit the conf file and restart the engine
> again. Note that your exclude file must not have any entries to start with.
> After starting Hadoop with a 0-byte exclude file, put in the full IP and
> port and then run -refreshNodes. It should work.
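>
> Roughly, the sequence would be something like this (just a sketch; adjust
> the paths and the node address to your setup):
>
>   stop-mapred.sh
>   stop-dfs.sh
>   cp /dev/null /home/hadoop/excludes    # exclude file exists but is empty at startup
>   start-dfs.sh
>   start-mapred.sh
>   echo "10.0.3.31:50010" >> /home/hadoop/excludes
>   hadoop dfsadmin -refreshNodes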
>
>
> ------------------------
> Cheers !!!
> Siddharth Tiwari
> TCS world wide Data warehouse and Analytic Team - Americas
> Have a refreshing day !!!
>
>
> ------------------------------
> Date: Mon, 19 Sep 2011 18:51:28 +0530
> Subject: Re: Decommission of datanode(Urgent)
>
> From: vikas.srivastava@one97.net
> To: user@hive.apache.org
>
> Hey Sid,
>
> Thanks, but I have already tried that as well.
>
> 1: First, stop the cluster.
>
> "please make note that this file must be included at the namenode
> startup"
>
> By this, do you mean that I should add the file under dfs.hosts.exclude in
> hdfs-site, then run start-dfs.sh and start-mapred.sh, and, after the server
> has started, add the ip:port to the exclude file and then run the command
> "-refreshNodes"?
>
> regards
> Vikas
>
>
> On Mon, Sep 19, 2011 at 6:40 PM, Siddharth Tiwari <siddharth.tiwari@live.com> wrote:
>
> Hi Vikas,
>
> Please include your exclude file in hdfs-site.xml under the property
> dfs.hosts.exclude.
> Please note that this file must be in place at NameNode startup, and then
> you have to edit it for the datanodes you want to exclude. Secondly, after
> editing it with the ip:port of the retiring datanode, use the command
> bin/hadoop dfsadmin -refreshNodes.
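>
> For example, with the path and entry from your mail below, after editing
> the file it would look roughly like:
>
>   $ cat /home/hadoop/excludes
>   10.0.3.31:50010
>   $ bin/hadoop dfsadmin -refreshNodes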
>
> Also, decommissioning does not happen momentarily, since it requires
> replication of potentially a large number of blocks and we do not want the
> cluster to be overwhelmed with just this one job. The decommission progress
> can be monitored on the name-node Web UI. Until all blocks are replicated
> the node will be in "Decommission In Progress" state. When decommission is
> done the state will change to "Decommissioned". The nodes can be removed
> whenever decommission is finished.
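>
> (If you want to check from the command line as well, the per-datanode
> "Decommission Status" field in the output of
>
>   bin/hadoop dfsadmin -report
>
> should show the same progression, though the web UI is the usual place to
> watch it.)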
>
> The decommission process can be terminated at any time by editing the
> configuration or the exclude files and repeating the -refreshNodes command.
>
> hope it helps.
>
> ------------------------
> Cheers !!!
> Siddharth Tiwari
> TCS world wide Data warehouse and Analytic Team - Americas
> Have a refreshing day !!!
>
>
> ------------------------------
> Date: Mon, 19 Sep 2011 12:50:13 +0530
> Subject: Decommission of datanode(Urgent)
> From: vikas.srivastava@one97.net
> To: user@hive.apache.org; ayonsinha@yahoo.com; viral.bajaria@gmail.com
> CC: nitin2.kumar@one97.net; abhinav.mishra@one97.net
>
>
> Hey folks !!
>
> I tried to decommission a datanode from the Hadoop cluster. These are the steps I followed:
>
> 1: Add this to core-site.xml:
>
>   <property>
>     <name>dfs.hosts.exclude</name>
>     <value>/home/hadoop/excludes</value>
>     <final>true</final>
>   </property>
> 2: Add this to mapred-site.xml:
>   <property>
>     <name>mapred.hosts.exclude</name>
>     <value>/home/hadoop/excludes</value>
>     <final>true</final>
>   </property>
>
>
> 3: Create an excludes file and add ip:port entries to it,
>
> e.g.: 10.0.3.31:50010
>
> 4: Run the command
>
> hadoop dfsadmin -refreshNodes
>
>
> 5: After that, my live nodes went to 0 and all nodes became dead. I
> checked the NameNode logs, where I found these error messages:
>
> 2011-09-19 12:33:47,695 INFO org.apache.hadoop.ipc.Server: IPC Server
> handler 24 on 9000, call sendHeartbeat(DatanodeRegistration(
> 10.0.3.16:50010, storageID=DS-1703098060-10.0.3.16-50010-1298269611944,
> infoPort=50075, ipcPort=50020), 2012206694400, 1650194042865, 271003275264,
> 0, 1) from 10.0.3.16:38587: error:
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode
> denied communication with namenode: 10.0.3.16:50010
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: 10.0.3.16:50010
>         at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.handleHeartbeat(FSNamesystem.java:2235)
>         at
> org.apache.hadoop.hdfs.server.namenode.NameNode.sendHeartbeat(NameNode.java:704)
>         at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
> 2011-09-19 12:33:47,701 INFO org.apache.hadoop.ipc.Server: IPC Server
> handler 7 on 9000, call sendHeartbeat(DatanodeRegistration(10.0.5.36:50010,
> storageID=DS-809855347-10.0.5.36-50010-1316252293924, infoPort=50075,
> ipcPort=50020), 1938687860736, 1390486994944, 457712619520, 0, 1) from
> 10.0.5.36:58924: error:
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode
> denied communication with namenode: 10.0.5.36:50010
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: 10.0.5.36:50010
>         at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.handleHeartbeat(FSNamesystem.java:2235)
>         at
> org.apache.hadoop.hdfs.server.namenode.NameNode.sendHeartbeat(NameNode.java:704)
>         at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
>
>
>
>
> Please suggest; any help would be appreciated!
>
>
> --
> With Regards
> Vikas Srivastava
>
> DWH & Analytics Team
> Mob:+91 9560885900
> One97 | Let's get talking !
>
>
>
>
> --
> With Regards
> Vikas Srivastava
>
> DWH & Analytics Team
> Mob:+91 9560885900
> One97 | Let's get talking !
>
>


-- 
With Regards
Vikas Srivastava

DWH & Analytics Team
Mob:+91 9560885900
One97 | Let's get talking !
