From: shashwat shriparv <dwivedishashwat@gmail.com>
Date: Sun, 17 Mar 2013 23:37:48 +0530
Subject: Re: Hadoop Debian Package
To: user@hadoop.apache.org

Try:

find / -type f -iname "*site.xml"

It will show you wherever those files are.
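Once you know where they are, make sure the daemons are actually started
with that directory. A minimal sketch (the path is only an illustration,
use whatever find reports; HADOOP_HOME is assumed to point at your install):

export HADOOP_CONF_DIR=/usr/shar/hadoop/templates/conf  # illustrative path
$HADOOP_HOME/bin/stop-all.sh    # restart the daemons so the
$HADOOP_HOME/bin/start-all.sh   # new conf dir is picked up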
∞
Shashwat Shriparv


On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani
<m.alkahtani@gmail.com> wrote:

> The problem is that I tried to make Hadoop read the configuration by changing
> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> but I think Hadoop doesn't get the configuration from this dir. I tried and
> searched the system for a conf dir; the only dir is this one, which I changed.
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv
> <dwivedishashwat@gmail.com> wrote:
>
>> Yes, it is asking for file:/// instead of hdfs://; just check whether it
>> is taking its configuration from some other location...
>>
>> ∞
>> Shashwat Shriparv
>>
>>
>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna
>> <luangsay@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> Which version of Hadoop do you use?
>>>
>>> Try using fs.defaultFS instead of fs.default.name (see the list of all
>>> the deprecated properties here:
>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html).
>>> I remember I once had a similar error message and it was due to the
>>> change in property names.
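>>>
>>> For example, on a release that supports fs.defaultFS (2.x), a minimal
>>> core-site.xml could be written like this (a sketch only; localhost:9000
>>> is the value from your mail, and the path is your templates conf dir):
>>>
>>> cat > /usr/shar/hadoop/templates/conf/core-site.xml <<'EOF'
>>> <?xml version="1.0"?>
>>> <configuration>
>>>   <property>
>>>     <name>fs.defaultFS</name>
>>>     <value>hdfs://localhost:9000</value>
>>>   </property>
>>> </configuration>
>>> EOF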
>>>
>>> Regards,
>>>
>>> Sourygna
>>>
>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>> <m.alkahtani@gmail.com> wrote:
>>> > Hi to all users of Hadoop,
>>> >
>>> > I installed Hadoop from the .deb file on Ubuntu 12.04, but I might not
>>> > have configured it right. The conf dir is under templates in
>>> > /usr/shar/hadoop. I edited the core-site.xml and mapred-site.xml files
>>> > to contain
>>> >
>>> > <property>
>>> >   <name>fs.default.name</name>
>>> >   <value>hdfs://localhost:9000</value>
>>> > </property>
>>> >
>>> > and for mapred
>>> >
>>> > <property>
>>> >   <name>mapred.job.tracker</name>
>>> >   <value>localhost:9001</value>
>>> > </property>
>>> >
>>> > but I get the errors below, so I assume there is a problem: Hadoop
>>> > cannot read the configuration files.
>>> > I changed hadoop-env.sh to
>>> > export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>> > but that doesn't solve the problem.
>>> >
>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port
>>> > authority: file:///
>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>> >
>>> > ________________________________
>>> >
>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port
>>> > authority: local
>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>> >   at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>> >   at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070)
>>> >   at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889)
>>> >   at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883)
>>> >   at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312)
>>> >   at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303)
>>> >   at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>> >   at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>> >
>>> > ________________________________
>>> >
>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port
>>> > authority: file:///
>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>> >
>>> > ________________________________
>>> >
>>> > Exception in thread "main" java.lang.IllegalArgumentException: Does not
>>> > contain a valid host:port authority: file:///
>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>> >   at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>> >   at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>> >   at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>> >
>>> > ________________________________
>>> >
>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
>>> > because java.lang.IllegalArgumentException: Does not contain a valid
>>> > host:port authority: local
>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>> >   at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>> >   at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532)
>>> >   at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>> >
>>> > Regards,
>>> > Mohammad Alkahtani
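P.S. The "file:///" and "local" in those traces are the built-in defaults
for fs.default.name and mapred.job.tracker, which confirms the daemons never
read your edited files. To see the configuration a Hadoop command actually
loads, you can try dumping it (a sketch: the hadoop launcher can run a bare
class name, and Configuration prints the merged config as XML; check what
fs.default.name resolves to in the output):

hadoop org.apache.hadoop.conf.Configuration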