From: shashwat shriparv <dwivedishashwat@gmail.com>
Date: Sun, 17 Mar 2013 23:27:47 +0530
Subject: Re: Hadoop Debian Package
To: user@hadoop.apache.org

Yes, it is asking for file:/// instead of hdfs://. Just check whether it is picking up its configuration from some other location.

∞
Shashwat Shriparv

On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <luangsay@gmail.com> wrote:
> Hi,
>
> What is the version of Hadoop you use?
>
> Try using fs.defaultFS instead of fs.default.name (see the list of all
> the deprecated properties here:
> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
> ).
> I remember I once had a similar error message, and it was due to the
> change in property names.
>
> Regards,
>
> Sourygna
>
> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:
> > Hi to all users of Hadoop,
> >
> > I installed Hadoop from the .deb file on Ubuntu 12.04, but I may not have
> > configured it correctly. The conf dir is under templates in /usr/shar/hadoop.
> > I edited the core-site.xml and mapred-site.xml files to contain
> >
> > <property>
> >   <name>fs.default.name</name>
> >   <value>hdfs://localhost:9000</value>
> > </property>
> >
> > and for mapred
> >
> > <property>
> >   <name>mapred.job.tracker</name>
> >   <value>localhost:9001</value>
> > </property>
> >
> > but I get the errors below; I assume Hadoop cannot read
> > the configuration files.
> > I changed hadoop-env.sh to
> >
> > export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> >
> > but that doesn't solve the problem.
> >
> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
> >
> > ________________________________
> >
> > FATAL org.apache.hadoop.mapred.JobTracker:
> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: local
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
> >     at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
> >     at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070)
> >     at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889)
> >     at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883)
> >     at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312)
> >     at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303)
> >     at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
> >     at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
> >
> > ________________________________
> >
> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
> >
> > ________________________________
> >
> > Exception in thread "main" java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> >     at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
> >     at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
> >     at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
> >
> > ________________________________
> >
> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
> > because java.lang.IllegalArgumentException: Does not contain a valid host:port authority: local
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
> >     at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
> >     at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532)
> >     at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
> >
> >
> > Regards,
> > Mohammad Alkahtani
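All five stack traces above stop in the same frame, NetUtils.createSocketAddr: because the site files were not found, each daemon fell back to the default value of its address property (file:/// for fs.default.name, local for mapred.job.tracker), and neither default is a host:port pair. A minimal Python sketch of that kind of validation (a simplified illustration, not Hadoop's actual NetUtils code):

```python
def create_socket_addr(target):
    """Mimic the host:port authority check that NetUtils.createSocketAddr
    performs: strip an optional scheme, then require host:port."""
    authority = target.split("://", 1)[1] if "://" in target else target
    host, sep, port = authority.rpartition(":")
    if not sep or not host or not port.isdigit():
        raise ValueError(
            "Does not contain a valid host:port authority: " + target)
    return host, int(port)

# An explicit hdfs://localhost:9000 parses, while the two fallback
# defaults seen in the logs are rejected:
print(create_socket_addr("hdfs://localhost:9000"))  # ('localhost', 9000)
for bad in ("file:///", "local"):
    try:
        create_socket_addr(bad)
    except ValueError as exc:
        print(exc)
```

This is why the same IllegalArgumentException shows up for every daemon at once: they all read the same missing configuration, not five separate bugs.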
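Sourygna's fs.defaultFS / fs.default.name point can be sketched as a small resolver: newer Hadoop releases keep reading the old key as a deprecated alias of the new one, so either spelling works when the file is actually loaded. A hypothetical helper under that assumption (not Hadoop's Configuration class):

```python
import xml.etree.ElementTree as ET

# new property name -> deprecated old name (one entry, for illustration)
DEPRECATED = {"fs.defaultFS": "fs.default.name"}

def load_site_xml(text):
    """Parse a Hadoop-style *-site.xml into a {name: value} dict."""
    root = ET.fromstring(text)
    return {p.findtext("name"): p.findtext("value")
            for p in root.iter("property")}

def get(conf, key):
    """Look up a key, falling back to its deprecated alias if present."""
    if key in conf:
        return conf[key]
    old = DEPRECATED.get(key)
    return conf.get(old) if old else None

core_site = """<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>"""

conf = load_site_xml(core_site)
print(get(conf, "fs.defaultFS"))  # hdfs://localhost:9000, via the old name
```

So the deprecated name alone does not explain the error above; the symptom there is that no site file was read at all, which is why checking HADOOP_CONF_DIR was the right first step.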