hadoop-common-user mailing list archives

From Mohammad Alkahtani <m.alkaht...@gmail.com>
Subject Re: Hadoop Debian Package
Date Sun, 17 Mar 2013 20:28:36 GMT
Thank you Tariq, I removed the .deb package and downloaded the source file
hadoop-1.0.4.tar.gz<http://mirrors.isu.net.sa/pub/apache/hadoop/common/stable/hadoop-1.0.4.tar.gz>
and it worked very well.
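
For anyone landing on this thread later, the tarball route boils down to a few
commands. This is a minimal sketch, assuming extraction under /usr/local; the
target directory and the PATH line are choices, not anything the release mandates:

wget http://mirrors.isu.net.sa/pub/apache/hadoop/common/stable/hadoop-1.0.4.tar.gz
sudo tar -xzf hadoop-1.0.4.tar.gz -C /usr/local

# in ~/.bashrc: point HADOOP_HOME at the extracted tree and put its scripts on PATH
export HADOOP_HOME=/usr/local/hadoop-1.0.4
export PATH=$PATH:$HADOOP_HOME/bin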

Thank you again

Mohammad Alkahtani
P.O.Box 102275
Riyadh 11675
Saudi Arabia
mobile: 00966 555 33 1717


On Sun, Mar 17, 2013 at 11:07 PM, Mohammad Tariq <dontariq@gmail.com> wrote:

> You have to use upper case 'HADOOP_HOME' (never mind if it's just a typo). Do
> you have proper permission to read these files?
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Mon, Mar 18, 2013 at 1:00 AM, Mohammad Alkahtani <m.alkahtani@gmail.com> wrote:
>
>> The files that are in hadoop-x.x.x/bin are in the /usr/bin dir. I tried to set
>> HADOOP_HOME to /usr but still get the errors; I also tried /etc/hadoop and
>> got the error.
>>
>> Mohammad Alkahtani
>> P.O.Box 102275
>> Riyadh 11675
>> Saudi Arabia
>> mobile: 00966 555 33 1717
>>
>>
>> On Sun, Mar 17, 2013 at 10:15 PM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>
>>> Set these properties in the configuration files present in your /etc
>>> directory. HADOOP_HOME is the parent directory of the hadoop bin
>>> directory that holds the Hadoop scripts, so set that accordingly in your
>>> .bashrc file. A sketch of both steps follows below.
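
A rough sketch of those two steps for the Debian layout described further down
in this thread; the exact paths (/usr/share/hadoop/templates/conf for the edited
templates, /etc/hadoop as the live conf directory, /usr as the parent of the bin
directory) are taken from the messages below and are assumptions about this
particular package, not a definitive recipe:

# copy the edited template configs into the conf directory the scripts read
sudo cp /usr/share/hadoop/templates/conf/*.xml /etc/hadoop/

# in ~/.bashrc: HADOOP_HOME is the parent of the directory holding the hadoop scripts
export HADOOP_HOME=/usr
source ~/.bashrc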
>>>
>>> Warm Regards,
>>> Tariq
>>> https://mtariq.jux.com/
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Mon, Mar 18, 2013 at 12:35 AM, Mohammad Alkahtani <
>>> m.alkahtani@gmail.com> wrote:
>>>
>>>> Thank you Mohammad Tariq
>>>>
>>>> Mohammad Alkahtani
>>>> P.O.Box 102275
>>>> Riyadh 11675
>>>> Saudi Arabia
>>>> mobile: 00966 555 33 1717
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 10:04 PM, Mohammad Alkahtani <
>>>> m.alkahtani@gmail.com> wrote:
>>>>
>>>>> I tried all of the hadoop home dirs but it didn't work.
>>>>>
>>>>> Mohammad Alkahtani
>>>>> P.O.Box 102275
>>>>> Riyadh 11675
>>>>> Saudi Arabia
>>>>> mobile: 00966 555 33 1717
>>>>>
>>>>>
>>>>> On Sun, Mar 17, 2013 at 9:57 PM, Mohammad Alkahtani <
>>>>> m.alkahtani@gmail.com> wrote:
>>>>>
>>>>>> OK, what should the Hadoop home be on Ubuntu? The binary files are
>>>>>> in /usr/bin,
>>>>>> the hadoop-env.sh and other xml files are in /etc/hadoop,
>>>>>> and the conf files are in /usr/share/hadoop/templates/conf.
>>>>>>
>>>>>> Shall I use /usr as the Hadoop path, because it is the dir that contains
>>>>>> the bin files?
>>>>>>
>>>>>> Mohammad Alkahtani
>>>>>> P.O.Box 102275
>>>>>> Riyadh 11675
>>>>>> Saudi Arabia
>>>>>> mobile: 00966 555 33 1717
>>>>>>
>>>>>>
>>>>>> On Sun, Mar 17, 2013 at 9:50 PM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>>>>>
>>>>>>> Log out from the user, log in again, and see if it works.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> https://mtariq.jux.com/
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Mon, Mar 18, 2013 at 12:18 AM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>>>>>>
>>>>>>>> You can avoid the warning by setting the following prop to true in
>>>>>>>> the hadoop-env.sh file:
>>>>>>>> export HADOOP_HOME_WARN_SUPPRESS=true
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> https://mtariq.jux.com/
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Mar 18, 2013 at 12:07 AM, Mohammad Alkahtani <
>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Thank you Mohammad
>>>>>>>>> I still get the same error, with this msg:
>>>>>>>>>
>>>>>>>>> localhost: Warning: $HADOOP_HOME is deprecated.
>>>>>>>>> I searched ~/.bashrc but only what I wrote is there.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Mohammad Alkahtani
>>>>>>>>> P.O.Box 102275
>>>>>>>>> Riyadh 11675
>>>>>>>>> Saudi Arabia
>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sun, Mar 17, 2013 at 9:21 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> You can do that using these commands:
>>>>>>>>>>
>>>>>>>>>> sudo gedit ~/.bashrc
>>>>>>>>>>
>>>>>>>>>> then go to the end of the file and add this line:
>>>>>>>>>> export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH
>>>>>>>>>>
>>>>>>>>>> after that, use this to apply the changes:
>>>>>>>>>> source ~/.bashrc
>>>>>>>>>>
>>>>>>>>>> to check it :
>>>>>>>>>> echo $HADOOP_HOME
>>>>>>>>>>
>>>>>>>>>> This will permanently set your HADOOP_HOME.
>>>>>>>>>>
>>>>>>>>>> HTH
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> https://mtariq.jux.com/
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani <
>>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi Tariq, could you please tell me how to set HADOOP_HOME,
>>>>>>>>>>> because I don't find it in the hadoop-env.sh.
>>>>>>>>>>>
>>>>>>>>>>> Thank you, Shashwat.
>>>>>>>>>>> This is the output, and it is already configured,
>>>>>>>>>>> but Hadoop doesn't read the configuration from here.
>>>>>>>>>>>
>>>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/22/commons-parent-22-site.xml
>>>>>>>>>>> /usr/share/maven-repo/org/apache/commons/commons-parent/debian/commons-parent-debian-site.xml
>>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
>>>>>>>>>>> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
>>>>>>>>>>> /usr/share/compiz/composite.xml
>>>>>>>>>>> /usr/share/hadoop/templates/conf/mapred-site.xml
>>>>>>>>>>> /usr/share/hadoop/templates/conf/core-site.xml
>>>>>>>>>>> /usr/share/hadoop/templates/conf/hdfs-site.xml
>>>>>>>>>>>
>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
>>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Try
>>>>>>>>>>>> find / -type f -iname "*site.xml"
>>>>>>>>>>>> It will show you wherever those files are.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> ∞
>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>>>>>>>>>>>> m.alkahtani@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> The problem is that I tried to have the configuration file read by
>>>>>>>>>>>>> changing
>>>>>>>>>>>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>> but I think Hadoop doesn't get the configuration from this dir.
>>>>>>>>>>>>> I tried and searched the system for a conf dir; the only dir is this
>>>>>>>>>>>>> one, which I changed.
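
One detail worth a second look here: the find output quoted earlier in this
message lists the templates under /usr/share/hadoop, while the export above
spells it /usr/shar. A quick check, as a sketch (the corrected path is inferred
from that find output, not confirmed elsewhere in the thread):

# does the directory named in HADOOP_CONF_DIR actually exist?
ls /usr/shar/hadoop/templates/conf 2>/dev/null || echo "no such dir"

# the find output suggests the spelling is 'share', not 'shar'
export HADOOP_CONF_DIR=/usr/share/hadoop/templates/conf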
>>>>>>>>>>>>>
>>>>>>>>>>>>> Mohammad Alkahtani
>>>>>>>>>>>>> P.O.Box 102275
>>>>>>>>>>>>> Riyadh 11675
>>>>>>>>>>>>> Saudi Arabia
>>>>>>>>>>>>> mobile: 00966 555 33 1717
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>>>>>>>>>>>> dwivedishashwat@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Yes, it is asking for file:/// instead of hdfs://. Just check
>>>>>>>>>>>>>> whether it is taking its configuration settings from another location...
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ∞
>>>>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <
>>>>>>>>>>>>>> luangsay@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> What version of Hadoop do you use?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of
>>>>>>>>>>>>>>> all the deprecated properties here:
>>>>>>>>>>>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>>>>>>>>>>>>>> ).
>>>>>>>>>>>>>>> I remember I once had a similar error message, and it was due to a
>>>>>>>>>>>>>>> change in property names.
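
As an illustration of that suggestion, the swap in core-site.xml would look
roughly like this. A sketch only: the conf path is assumed from the rest of the
thread, and whether 1.0.4 itself wants the new or the old name is exactly what
the deprecation table linked above settles:

# rewrite core-site.xml using the non-deprecated property name
cat > /usr/share/hadoop/templates/conf/core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF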
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Regards,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Sourygna
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>>>>>>>>>>>> <m.alkahtani@gmail.com> wrote:
>>>>>>>>>>>>>>> > Hi to all users of Hadoop,
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04, but I might
>>>>>>>>>>>>>>> > not have configured it right. The conf dir is under templates in
>>>>>>>>>>>>>>> > /usr/share/hadoop. I edited the core-site.xml and mapred-site.xml
>>>>>>>>>>>>>>> > files to give:
>>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>>> > <name>fs.default.name</name>
>>>>>>>>>>>>>>> > <value>hdfs://localhost:9000</value>
>>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>>> > and for mapred
>>>>>>>>>>>>>>> > <property>
>>>>>>>>>>>>>>> > <name>mapred.job.tracker</name>
>>>>>>>>>>>>>>> > <value>localhost:9001</value>
>>>>>>>>>>>>>>> > </property>
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > but I get these errors; I assume there is a problem and Hadoop
>>>>>>>>>>>>>>> > cannot read the configuration file. I changed the hadoop-env.sh to
>>>>>>>>>>>>>>> > export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>>>>>>>>>>>> > but it doesn't solve the problem.
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>>>>> > host:port authority: file:///
>>>>>>>>>>>>>>> > at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>>>>> > host:port authority: local
>>>>>>>>>>>>>>> > at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>>>>>>>>>> > at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
>>>>>>>>>>>>>>> > java.lang.IllegalArgumentException: Does not contain a valid
>>>>>>>>>>>>>>> > host:port authority: file:///
>>>>>>>>>>>>>>> > at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:536)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1410)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1419)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > Exception in thread "main" java.lang.IllegalArgumentException:
>>>>>>>>>>>>>>> > Does not contain a valid host:port authority: file:///
>>>>>>>>>>>>>>> > at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:167)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:135)
>>>>>>>>>>>>>>> > at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:650)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ________________________________
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > ERROR org.apache.hadoop.mapred.TaskTracker: Can not start
>>>>>>>>>>>>>>> > task tracker because java.lang.IllegalArgumentException: Does
>>>>>>>>>>>>>>> > not contain a valid host:port authority: local
>>>>>>>>>>>>>>> > at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>>>>>>>>>>>>>> > at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1532)
>>>>>>>>>>>>>>> > at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3906)
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> >
>>>>>>>>>>>>>>> > Regards,
>>>>>>>>>>>>>>> > Mohammad Alkahtani
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
