Subject: Fwd: Setting Hadoop on LinuxContainers Fails.
From: Vicky Kak <vicky.kak@gmail.com>
To: user@hadoop.apache.org
Date: Mon, 24 Mar 2014 15:32:45 +0530

Hi All,

I am using Linux Containers (http://linuxcontainers.org/) to configure a Hadoop cluster for testing. I have created two Linux application containers, called hadoop1 and hadoop2. The IP associated with hadoop1 is 10.0.3.200, and the IP associated with hadoop2 is 10.0.3.201.
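For context, the container addresses are set statically in each container's LXC config, along these lines (a minimal sketch of the relevant lines for hadoop1; hadoop2 uses 10.0.3.201; values are illustrative, not my full config):

# /var/lib/lxc/hadoop1/config (network section only; sketch)
lxc.network.type = veth
lxc.network.link = lxcbr0           # the default LXC bridge
lxc.network.flags = up
lxc.network.ipv4 = 10.0.3.200/24    # static address for hadoop1
lxc.network.ipv4.gateway = 10.0.3.1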
I am able to start the NameNode on 10.0.3.200, but when I try to start the DataNode on 10.0.3.201 I see the following error at 10.0.3.201:

****************************************************************************************
$ hdfs datanode
14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = Hadoop2/10.0.3.148
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.2.0
STARTUP_MSG:   classpath = /home/ubuntu/Installed/hadoop-2.2.0/etc/hadoop:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/servlet-api-2.5.jar:[...remaining stock hadoop-2.2.0 jars under /home/ubuntu/Installed/hadoop-2.2.0/share trimmed...]:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
STARTUP_MSG:   java = 1.7.0
************************************************************/
14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
14/03/24 09:30:57 WARN common.Util: Path /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in configuration files. Please update hdfs configuration.
14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system started
14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /0.0.0.0:50010
14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576 bytes/s
14/03/24 09:30:58 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at localhost:50075
14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
14/03/24 09:30:59 INFO mortbay.log: Started SelectChannelConnector@localhost:50075
14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port 50020
14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /0.0.0.0:50020
14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for nameservices: null
14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for nameservices: <default>
14/03/24 09:30:59 WARN common.Util: Path /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in configuration files. Please update hdfs configuration.
14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering> (storage id unknown) service to /10.0.3.200:9000 starting to offer service
14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
14/03/24 09:30:59 INFO common.Storage: Lock on /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by nodename 2618@Hadoop2
14/03/24 09:31:00 INFO common.Storage: Locking is disabled
14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage: nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume - /home/ubuntu/dallaybatta-data/hdfs/datanode/current
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 1395674259100 with interval 21600000
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool BP-1489452897-10.0.3.253-1395650301038
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool BP-1489452897-10.0.3.253-1395650301038 on volume /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool BP-1489452897-10.0.3.253-1395650301038 on /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for block pool BP-1489452897-10.0.3.253-1395650301038 on volume /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1489452897-10.0.3.253-1395650301038 on volume /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas to map: 1ms
14/03/24 09:31:00 INFO datanode.DataNode: Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000 beginning handshake with NN
14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException): Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
    at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
    at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)

    at org.apache.hadoop.ipc.Client.call(Client.java:1347)
    at org.apache.hadoop.ipc.Client.call(Client.java:1300)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at $Proxy9.registerDatanode(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at $Proxy9.registerDatanode(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
    at java.lang.Thread.run(Thread.java:722)
14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for: Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id DS-1380795562-10.0.3.201-50010-1395650455122)
14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool BP-1489452897-10.0.3.253-1395650301038
14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
************************************************************/
****************************************************************************************
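(Side note: I assume the repeated common.Util warning above just wants the datanode directory written as a file:// URI. A minimal sketch of what I think it is asking for in hdfs-site.xml, not something I have verified:)

<!-- hdfs-site.xml: same local path as before, expressed as a URI -->
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///home/ubuntu/dallaybatta-data/hdfs/datanode</value>
</property>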
And here is the corresponding error at the NameNode (10.0.3.200):

****************************************************************************************
14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved datanode registration from 10.0.3.201
14/03/24 09:31:00 ERROR security.UserGroupInformation: PriviledgedActionException as:ubuntu (auth:SIMPLE) cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode from 10.0.3.201:60951 Call#1 Retry#0: error: org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
    at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
    at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
****************************************************************************************

I don't know where the 10.0.3.148 IP is coming from yet; it could be due to some LXC configuration. What can be interpreted from the Hadoop error information?
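My current guess is that the NameNode cannot resolve the registering DataNode's address back to a known hostname, which would explain the "Unresolved datanode registration" warning. Two things I am considering, sketched below but not yet verified on this setup: pinning the names in /etc/hosts on both containers, and relaxing the NameNode's registration check (I believe this property exists in 2.2.0, please correct me if not):

# /etc/hosts on both hadoop1 and hadoop2 (hypothetical entries)
10.0.3.200  hadoop1
10.0.3.201  hadoop2

<!-- NameNode hdfs-site.xml: skip the reverse-DNS check at datanode registration -->
<property>
  <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
  <value>false</value>
</property>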
Let me know if you need more info about my environment to provide some insights.

Regards,
Vicky