Subject: Re: Setting Hadoop on LinuxContainers Fails.
From: Jay Vyas
To: "common-user@hadoop.apache.org"
Date: Mon, 24 Mar 2014 10:20:10 -0400

Are your Linux containers networked properly (i.e. can they see each other, the outside world, etc.)?
www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/

On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak wrote:

> Hi All,
>
> I am using Linux containers (http://linuxcontainers.org/) to configure
> a Hadoop cluster for testing.
> I have created two Linux application containers, called hadoop1 and
> hadoop2. The IP associated with hadoop1 is 10.0.3.200, and with hadoop2
> it is 10.0.3.201.
>
> I am able to start the NameNode on 10.0.3.200, but when I try to start the
> DataNode on 10.0.3.201 I see the following error at 10.0.3.201:
>
> ****************************************************************************************
> $ hdfs datanode
> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = Hadoop2/10.0.3.148
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 2.2.0
> STARTUP_MSG:   classpath = /home/ubuntu/Installed/hadoop-2.2.0/etc/hadoop:[long listing of Hadoop 2.2.0 jars elided]
> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r
> 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
> STARTUP_MSG:   java = 1.7.0
> ************************************************************/
> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers
> for [TERM, HUP, INT]
> 14/03/24 09:30:57 WARN common.Util: Path
> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified
> as a URI in
> configuration files. Please update hdfs configuration.
> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
> hadoop-metrics2.properties
> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
> at 10 second(s).
> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
> started
> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /0.0.0.0:50010
> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
> bytes/s
> 14/03/24 09:30:58 INFO mortbay.log: Logging to
> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> org.mortbay.log.Slf4jLog
> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context datanode
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context logs
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context static
> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
> localhost:50075
> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
> 14/03/24 09:30:59 INFO mortbay.log: Started
> SelectChannelConnector@localhost:50075
> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port 50020
> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /0.0.0.0:50020
> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
> nameservices: null
> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
> nameservices:
> 14/03/24 09:30:59 WARN common.Util: Path
> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
> configuration files. Please update hdfs configuration.
> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool
> (storage id unknown) service to /10.0.3.200:9000 starting to offer service
> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
> 14/03/24 09:30:59 INFO common.Storage: Lock on
> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
> nodename 2618@Hadoop2
> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree
> Verification scan starting at 1395674259100 with interval 21600000
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
> BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
> BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
> BP-1489452897-10.0.3.253-1395650301038 on
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all replicas
> for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map for
> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas
> to map: 1ms
> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
> BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> beginning handshake with NN
> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for block
> pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
> at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
> at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
> at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
> at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
> at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
> at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
> at org.apache.hadoop.ipc.Client.call(Client.java:1347)
> at org.apache.hadoop.ipc.Client.call(Client.java:1300)
> at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> at $Proxy9.registerDatanode(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:601)
> at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
> at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> at $Proxy9.registerDatanode(Unknown Source)
> at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
> at java.lang.Thread.run(Thread.java:722)
> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
> BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122)
> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
> BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
> ************************************************************/
>
> ****************************************************************************************
>
> And here is the corresponding error coming at the NameNode (10.0.3.200):
>
> ****************************************************************************************
> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
> datanode registration from 10.0.3.201
> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
> PriviledgedActionException as:ubuntu (auth:SIMPLE)
> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
> from 10.0.3.201:60951 Call#1 Retry#0: error:
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
> at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
> at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
> at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
> at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
> at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
> at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
> ****************************************************************************************
>
> I don't know yet where the 10.0.3.148 IP is coming from; it could be due
> to some lxc configuration. What can be interpreted from the Hadoop error
> information?
>
> Let me know if you need more info about my environment to provide some
> insights.
>
> Regards,
> Vicky

--
Jay Vyas
http://jayunit100.blogspot.com
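One concrete way to act on the networking check suggested above is to verify that each container hostname resolves to the address the cluster expects. The hostnames and IPs below are taken from this thread, but the sample hosts file and helper function are only a sketch, not output from the actual containers; on a real container you would inspect /etc/hosts directly or run `getent hosts Hadoop2`.

```shell
# Sketch: check that each Hadoop container hostname maps to the expected IP.
# /tmp/hosts.sample stands in for the container's /etc/hosts (hypothetical path).
cat > /tmp/hosts.sample <<'EOF'
10.0.3.200  Hadoop1
10.0.3.201  Hadoop2
EOF

check_host() {
  # $1 = hostname, $2 = expected IP
  resolved=$(awk -v h="$1" '$2 == h { print $1 }' /tmp/hosts.sample)
  if [ "$resolved" = "$2" ]; then
    echo "$1 -> $resolved (ok)"
  else
    echo "$1 -> ${resolved:-unresolved}, expected $2 (MISMATCH)"
  fi
}

check_host Hadoop1 10.0.3.200
check_host Hadoop2 10.0.3.201
```

A mismatch here (for example, Hadoop2 still resolving to a stale 10.0.3.148 address from an earlier DHCP lease) would explain both the `Hadoop2/10.0.3.148` startup banner and the NameNode's "Unresolved datanode registration" warning. If container DNS cannot be fixed, some Hadoop 2.x releases also expose `dfs.namenode.datanode.registration.ip-hostname-check` in the NameNode's hdfs-site.xml; verify it exists in your release before relying on it.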
are your linux containers networked properly (i.e. can the= y see each other, and the outside world, etc...) www.linux.org/threads/linux-containers-part-4-getting-to-the-univ= erse-ping-google-com.4428/


On Mon,= Mar 24, 2014 at 6:02 AM, Vicky Kak <vicky.kak@gmail.com> = wrote:
Hi All,

I am u= sing linuxcontainer(http://linuxcontainers.org/) for configuring the hadoop cluster for = the testing.
I have create two linux application containers which are called hadoo= p1/hadoop2. The IP's associated with the hadoop1 is 10.0.3.200 and with= hadoop2 is 10.0.3.201.

I am able to start the Namenode on 10.0.3.200 but when i try to s= tart the DataNode on 10.0.3.201 I see the following error at 10.0.3.201
=
***********************************************************************= *****************
$ hdfs datanode
14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG: <= br>/************************************************************
STARTUP= _MSG: Starting DataNode
STARTUP_MSG:=A0=A0 host =3D Hadoop2/10.0.3.148
STARTUP_MSG:=A0=A0 args =3D []
STARTUP_MSG:=A0=A0 version =3D 2.2.0
STARTUP_MSG:   classpath = /home/ubuntu/Installed/hadoop-2.2.0/etc/hadoop:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-json-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jettison-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/activation-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jets3t-0.6.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-math-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/junit-4.8.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsch-0.1.42.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-digester-1.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
STARTUP_MSG:   java = 1.7.0
************************************************************/
14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
14/03/24 09:30:57 WARN common.Util: Path /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in configuration files. Please update hdfs configuration.
14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system started
14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /0.0.0.0:50010
14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576 bytes/s
14/03/24 09:30:58 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at localhost:50075
14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
14/03/24 09:30:59 INFO mortbay.log: Started SelectChannelConnector@localhost:50075
14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port 50020
14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /0.0.0.0:50020
14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for nameservices: null
14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for nameservices: <default>
14/03/24 09:30:59 WARN common.Util: Path /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in configuration files. Please update hdfs configuration.
14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering> (storage id unknown) service to /10.0.3.200:9000 starting to offer service
14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
14/03/24 09:30:59 INFO common.Storage: Lock on /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by nodename 2618@Hadoop2
14/03/24 09:31:00 INFO common.Storage: Locking is disabled
14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage: nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume - /home/ubuntu/dallaybatta-data/hdfs/datanode/current
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 1395674259100 with interval 21600000
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool BP-1489452897-10.0.3.253-1395650301038
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool BP-1489452897-10.0.3.253-1395650301038 on volume /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool BP-1489452897-10.0.3.253-1395650301038 on /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for block pool BP-1489452897-10.0.3.253-1395650301038 on volume /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1489452897-10.0.3.253-1395650301038 on volume /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas to map: 1ms
14/03/24 09:31:00 INFO datanode.DataNode: Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000 beginning handshake with NN
14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException): Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
    at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
    at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
    at org.apache.hadoop.ipc.Client.call(Client.java:1347)
    at org.apache.hadoop.ipc.Client.call(Client.java:1300)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at $Proxy9.registerDatanode(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at $Proxy9.registerDatanode(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
    at java.lang.Thread.run(Thread.java:722)
14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for: Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id DS-1380795562-10.0.3.201-50010-1395650455122)
14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool BP-1489452897-10.0.3.253-1395650301038
14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
************************************************************/

****************************************************************************************
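(As an aside, the repeated `WARN common.Util: Path ... should be specified as a URI` lines in the datanode log can be silenced by writing the storage directory as a `file://` URI. A minimal sketch, assuming the path comes from the standard `dfs.datanode.data.dir` property in hdfs-site.xml:

```xml
<!-- hdfs-site.xml: give the datanode directory as a URI, not a bare path -->
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///home/ubuntu/dallaybatta-data/hdfs/datanode</value>
</property>
```

This warning is cosmetic and unrelated to the registration failure below, but cleaning it up makes the real errors easier to spot.)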


And here is the corresponding error coming at NameNode (10.0.3.201):

****************************************************************************************
14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved datanode registration from 10.0.3.201
14/03/24 09:31:00 ERROR security.UserGroupInformation: PriviledgedActionException as:ubuntu (auth:SIMPLE) cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode from 10.0.3.201:60951 Call#1 Retry#0: error: org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
    at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
    at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
****************************************************************************************

I don't know yet where the 10.0.3.148 IP is coming from; it could be due to some lxc configuration. What can be interpreted from the Hadoop error information?
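The `Unresolved datanode registration from 10.0.3.201` warning, followed by `DisallowedDatanodeException`, is typically what the namenode reports when it cannot reverse-resolve the connecting datanode's IP to a hostname, so consistent forward and reverse lookup between the containers is worth checking first. A minimal sketch of `/etc/hosts` entries for both containers, using the container names and IPs from the question (verify the hostnames actually match what each container reports with `hostname`):

```
# /etc/hosts on BOTH containers -- forward and reverse lookup must agree
10.0.3.200   hadoop1
10.0.3.201   hadoop2
```

After updating, something like `getent hosts 10.0.3.201` on the namenode container should return the datanode's hostname before the datanode is restarted.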

Let me know if you need more info about my environment to provide some insights.

Regards,
Vicky






--
Jay Vyas
http://jayunit100.blogspot.com