From: Mohammad Tariq <dontariq@gmail.com>
Date: Mon, 18 Mar 2013 02:00:37 +0530
Subject: Re: Hadoop Debian Package
To: user@hadoop.apache.org

good to hear that :)

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Mon, Mar 18, 2013 at 1:58 AM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:

> Thank you Tariq, I removed the .deb and downloaded the source file
> hadoop-1.0.4.tar.gz, and it worked very well.
>
> Thank you again
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 11:07 PM, Mohammad Tariq <dontariq@gmail.com> wrote:
>
>> You have to use upper case 'HADOOP_HOME' (don't mind if it's just a
>> typo). Do you have proper permission to read these files?
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
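
A quick way to check read permission on those files, as a minimal sketch; the paths are the ones reported later in this thread, so adjust them if your layout differs:

  # show owner, group, and mode bits of the packaged config files
  ls -l /etc/hadoop/hadoop-env.sh /usr/share/hadoop/templates/conf/*-site.xml

  # confirm the current user can actually open one of them
  test -r /usr/share/hadoop/templates/conf/core-site.xml && echo readable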
>>
>>
>> On Mon, Mar 18, 2013 at 1:00 AM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:
>>
>>> The files that are in hadoop-x.x.x/bin are in the /usr/bin dir. I tried
>>> to set HADOOP_HOME to /usr but I still get the errors; I tried
>>> /etc/hadoop also and I got the error.
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 10:15 PM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>>
>>>> Set these properties in the configuration files present in your /etc
>>>> directory. And HADOOP_HOME is the parent directory of the hadoop bin
>>>> directory that holds the Hadoop scripts, so set that accordingly in the
>>>> .bashrc file.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> https://mtariq.jux.com/
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Mon, Mar 18, 2013 at 12:35 AM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:
>>>>
>>>>> Thank you Mohammad Tariq
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> I tried all of the hadoop home dirs but it didn't work.
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:
>>>>>>
>>>>>>> OK, what should the Hadoop home be in Ubuntu? Because the binary
>>>>>>> files are in /usr/bin, the hadoop-env.sh and other xml files are in
>>>>>>> /etc/hadoop, and the conf files are in
>>>>>>> /usr/share/hadoop/templates/conf.
>>>>>>>
>>>>>>> Shall I use /usr as the hadoop path, because it is the dir that
>>>>>>> contains the bin files?
>>>>>>>
>>>>>>> Mohammad Alkahtani
>>>>>>> P.O.Box 102275
>>>>>>> Riyadh 11675
>>>>>>> Saudi Arabia
>>>>>>> mobile: 00966 555 33 1717
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>>>>>>
>>>>>>>> Log out from the user, log in again, and see if it works.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> https://mtariq.jux.com/
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> You can avoid the warning by setting the following prop to true in
>>>>>>>>> the hadoop-env.sh file:
>>>>>>>>>
>>>>>>>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> https://mtariq.jux.com/
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Thank you Mohammad
>>>>>>>>>> I still get the same error, with this msg:
>>>>>>>>>>
>>>>>>>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>>>>>>>>
>>>>>>>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>>>>>>>
>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>> P.O.Box 102275
>>>>>>>>>> Riyadh 11675
>>>>>>>>>> Saudi Arabia
>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> You can do that using these commands:
>>>>>>>>>>>
>>>>>>>>>>> sudo gedit ~/.bashrc
>>>>>>>>>>>
>>>>>>>>>>> then go to the end of the file and add this line:
>>>>>>>>>>>
>>>>>>>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>>>>>>>
>>>>>>>>>>> after that, source it to apply the changes:
>>>>>>>>>>>
>>>>>>>>>>> source ~/.bashrc
>>>>>>>>>>>
>>>>>>>>>>> to check it:
>>>>>>>>>>>
>>>>>>>>>>> echo $HADOOP_HOME
>>>>>>>>>>>
>>>>>>>>>>> This will permanently set your HADOOP_HOME.
>>>>>>>>>>>
>>>>>>>>>>> HTH
>>>>>>>>>>>
>>>>>>>>>>> Warm Regards,
>>>>>>>>>>> Tariq
>>>>>>>>>>> https://mtariq.jux.com/
>>>>>>>>>>> cloudfront.blogspot.com
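
For the tarball install that ultimately worked at the top of this thread, those .bashrc lines would look roughly like the sketch below; /home/you/hadoop-1.0.4 is a placeholder for wherever hadoop-1.0.4.tar.gz was unpacked, and the PATH line is an optional convenience not mentioned above:

  # append these two lines to ~/.bashrc, then run: source ~/.bashrc
  export HADOOP_HOME=/home/you/hadoop-1.0.4   # the unpacked dir that contains bin/ and conf/
  export PATH=$PATH:$HADOOP_HOME/bin          # optional: puts the hadoop script on PATH

  # afterwards, verify:
  echo $HADOOP_HOME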
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hi Tariq, could you please tell me how to set HADOOP_HOME?
>>>>>>>>>>>> Because I don't find it in the hadoop-env.sh.
>>>>>>>>>>>>
>>>>>>>>>>>> Thank you Shashwat,
>>>>>>>>>>>> this is the output, and it is already configured, but hadoop
>>>>>>>>>>>> doesn't read the configuration from here.
>>>>>>>>>>>>
>>>>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/22/commons-parent-22-site.xml
>>>>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian/commons-parent-debian-site.xml
>>>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>>>>>>>>> /usr/share/compiz/composite.xml
>>>>>>>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>>>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>>>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>>>>>>>
>>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> try
>>>>>>>>>>>>>
>>>>>>>>>>>>> find / -type f -iname "*site.xml"
>>>>>>>>>>>>>
>>>>>>>>>>>>> it will show you wherever those files are..
>>>>>>>>>>>>>
>>>>>>>>>>>>> ∞
>>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> The problem is that I tried to make it read the configuration
>>>>>>>>>>>>>> files by changing
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> but I think Hadoop doesn't get the configuration from this
>>>>>>>>>>>>>> dir. I tried, and searched the system for a conf dir; the
>>>>>>>>>>>>>> only dir is this one, which I changed.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>>>>> mobile: 00966 555 33 1717
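
One detail worth noting about that line, since the changed directory never seems to take effect: the shell's ${VAR:-default} form only applies the default when the variable is unset or empty, so if the package's wrapper scripts export HADOOP_CONF_DIR before hadoop-env.sh runs, the edit is silently ignored. A minimal sketch of the difference (also note the find output above shows the files under /usr/share, while the edited line says /usr/shar):

  # keeps any value that is already set; the default is only a fallback
  export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/share/hadoop/templates/conf"}

  # forces the directory regardless of what was set earlier
  export HADOOP_CONF_DIR="/usr/share/hadoop/templates/conf"

  # or pass it explicitly for a single invocation (the 1.x scripts accept --config)
  hadoop --config /usr/share/hadoop/templates/conf fs -ls /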
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://. Just
>>>>>>>>>>>>>>> check whether it is taking its configuration from some other
>>>>>>>>>>>>>>> location...
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> ∞
>>>>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <luangsay@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> What is the version of Hadoop you use?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the
>>>>>>>>>>>>>>>> list of all the deprecated properties here:
>>>>>>>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html).
>>>>>>>>>>>>>>>> I remember I once had a similar error message and it was
>>>>>>>>>>>>>>>> due to the change in property names.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Regards,
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Sourygna
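
For reference, the suggested property would look like this in core-site.xml; fs.defaultFS is the newer name, while 1.x releases still use the deprecated fs.default.name shown in the original message below:

  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>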
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:
>>>>>>>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04 but I could not
>>>>>>>>>>>>>>>> > configure it right. The conf dir is under templates in /usr/shar/hadoop. I
>>>>>>>>>>>>>>>> > edited the core-site.xml and mapred-site.xml files to give
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>>>> >   <name>fs.default.name</name>
>>>>>>>>>>>>>>>> >   <value>hdfs://localhost:9000</value>
>>>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > and for mapred
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>>>> >   <name>mapred.job.tracker</name>
>>>>>>>>>>>>>>>> >   <value>localhost:9001</value>
>>>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > but I get these errors; I assume there is a problem, Hadoop cannot read
>>>>>>>>>>>>>>>> > the configuration file. I changed the hadoop-env.sh to
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > but it doesn't solve the problem.
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: local
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException: Does not
>>>>>>>>>>>>>>>> > contain a valid host:port authority: file:///
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker
>>>>>>>>>>>>>>>> > because java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>>>>>> > host:port authority: local
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532)
>>>>>>>>>>>>>>>> >   at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>>> > Regards,
>>>>>>>>>>>>>>>> > Mohammad Alkahtani
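
A closing note on the traces above: file:/// and local are the built-in defaults of fs.default.name and mapred.job.tracker, so the daemons were clearly reading the stock configuration rather than the edited files. One way to see which values a 1.x client actually resolves is to dump the merged configuration; as far as I recall, the Configuration class ships a small main() that prints it as XML:

  # write the effective configuration to a file, then inspect the two values
  # (the grep patterns assume the XML is written unindented, on one line)
  hadoop org.apache.hadoop.conf.Configuration > effective-conf.xml
  grep -o 'fs.default.name</name><value>[^<]*' effective-conf.xml
  grep -o 'mapred.job.tracker</name><value>[^<]*' effective-conf.xml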