Subject: org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException
From: xeon Mailinglist <xeonmailinglist@gmail.com>
To: user@hadoop.apache.org
Date: Wed, 8 Jan 2014 23:51:49 +0000

When I try to launch the namenode and the datanode in MRv2, the datanode cannot connect to the namenode and fails with the error below. I have also included the core-site.xml file that I use. The firewall on the hosts is disabled, and I have no excluded nodes defined. Why can't the datanodes connect to the namenode? Any help solving this problem would be appreciated.
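To be clear about what I mean by "no excluded nodes defined": my hdfs-site.xml contains no dfs.hosts or dfs.hosts.exclude entries at all. As a sketch, an exclude list would normally be wired up like the following (the file path here is purely hypothetical, for illustration; nothing like it exists in my configuration):

  <property>
    <name>dfs.hosts.exclude</name>
    <!-- hypothetical path, shown only for illustration; my config has no such property -->
    <value>/path/to/dfs.exclude</value>
  </property>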
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException): Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1449645935-172.16.1.10-50010-1389224474955, infoPort=50075, ipcPort=50020, storageInfo=lv=-40;cid=CID-9a8571a3-17ae-49b2-b957-b009e88b9f9a;nsid=934416283;c=0)
        at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:631)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3398)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:881)
        at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
        at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:18295)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:454)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1014)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1741)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1737)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1478)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1735)

        at org.apache.hadoop.ipc.Client.call(Client.java:1235)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
        at com.sun.proxy.$Proxy9.registerDatanode(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:622)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
        at com.sun.proxy.$Proxy9.registerDatanode(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
        at java.lang.Thread.run(Thread.java:701)

This is the core-site.xml I use:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://10.103.0.17:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/tmp/hadoop-temp</value>
  </property>
  <property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
  </property>
</configuration>
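A side note on the config: since this is MRv2 (Hadoop 2.x), fs.default.name is deprecated in favor of fs.defaultFS. I assume the deprecated key is still honored, but the equivalent modern form would be:

  <property>
    <!-- fs.defaultFS is the Hadoop 2.x replacement for the deprecated fs.default.name -->
    <name>fs.defaultFS</name>
    <value>hdfs://10.103.0.17:9000</value>
  </property>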
