From: Mohammad Tariq <dontariq@gmail.com>
Date: Tue, 23 Jul 2013 22:38:19 +0530
Subject: Re: New hadoop 1.2 single node installation giving problems
To: user@hadoop.apache.org

Hello Ashish,

Change the permissions of /app/hadoop/tmp to 755 and see if it helps.

Warm Regards,
Tariq
cloudfront.blogspot.com


On Tue, Jul 23, 2013 at 10:27 PM, Ashish Umrani <ashish.umrani@gmail.com> wrote:

> Thanks Jitendra, Bejoy and Yexi,
>
> I got past that. And now the ls command says it cannot access the
> directory. I am sure this is a permissions issue. I am just wondering
> which directory I am missing permissions on.
>
> Any pointers?
>
> And once again, thanks a lot
>
> Regards
> ashish
>
> hduser@ashish-HP-Pavilion-dv6-Notebook-PC:/usr/local/hadoop/conf$ hadoop fs -ls
> Warning: $HADOOP_HOME is deprecated.
>
> ls: Cannot access .: No such file or directory.
>
>
> On Tue, Jul 23, 2013 at 9:42 AM, Jitendra Yadav <jeetuyadav200890@gmail.com> wrote:
>
>> Hi Ashish,
>>
>> Please check <property></property> in hdfs-site.xml.
>>
>> It is missing.
>>
>> Thanks.
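
For reference, every name/value pair in the Hadoop conf files has to be wrapped in a <property> element; the hdfs-site.xml quoted further down this thread is missing exactly that wrapper. A minimal corrected version (a sketch covering only the dfs.replication setting shown in this thread, not a complete config) would be:

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

    <!-- Put site-specific property overrides in this file. -->

    <configuration>
      <!-- the missing <property> wrapper goes around each name/value pair -->
      <property>
        <name>dfs.replication</name>
        <value>1</value>
        <description>Default block replication.
        The actual number of replications can be specified when the file is
        created. The default is used if replication is not specified in
        create time.</description>
      </property>
    </configuration>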
>>
>> On Tue, Jul 23, 2013 at 9:58 PM, Ashish Umrani <ashish.umrani@gmail.com> wrote:
>>
>>> Hey, thanks for the response. I have changed 4 files during installation:
>>>
>>> core-site.xml
>>> mapred-site.xml
>>> hdfs-site.xml and
>>> hadoop-env.sh
>>>
>>> I could not find any issues except that all params in the hadoop-env.sh
>>> are commented out. Only JAVA_HOME is uncommented.
>>>
>>> If you have a quick minute, can you please browse through these files in
>>> the email and let me know where the issue could be?
>>>
>>> Regards
>>> ashish
>>>
>>> I am listing those files below.
>>>
>>> core-site.xml
>>> <?xml version="1.0"?>
>>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>>
>>> <!-- Put site-specific property overrides in this file. -->
>>>
>>> <configuration>
>>>   <property>
>>>     <name>hadoop.tmp.dir</name>
>>>     <value>/app/hadoop/tmp</value>
>>>     <description>A base for other temporary directories.</description>
>>>   </property>
>>>
>>>   <property>
>>>     <name>fs.default.name</name>
>>>     <value>hdfs://localhost:54310</value>
>>>     <description>The name of the default file system.  A URI whose
>>>     scheme and authority determine the FileSystem implementation.  The
>>>     uri's scheme determines the config property (fs.SCHEME.impl) naming
>>>     the FileSystem implementation class.  The uri's authority is used to
>>>     determine the host, port, etc. for a filesystem.</description>
>>>   </property>
>>> </configuration>
>>>
>>> mapred-site.xml
>>> <?xml version="1.0"?>
>>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>>
>>> <!-- Put site-specific property overrides in this file. -->
>>>
>>> <configuration>
>>>   <property>
>>>     <name>mapred.job.tracker</name>
>>>     <value>localhost:54311</value>
>>>     <description>The host and port that the MapReduce job tracker runs
>>>     at.  If "local", then jobs are run in-process as a single map
>>>     and reduce task.
>>>     </description>
>>>   </property>
>>> </configuration>
>>>
>>> hdfs-site.xml
>>> <?xml version="1.0"?>
>>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>>
>>> <!-- Put site-specific property overrides in this file. -->
>>>
>>> <configuration>
>>>   <name>dfs.replication</name>
>>>   <value>1</value>
>>>   <description>Default block replication.
>>>     The actual number of replications can be specified when the file is created.
>>>     The default is used if replication is not specified in create time.
>>>   </description>
>>> </configuration>
>>>
>>> hadoop-env.sh
>>> # Set Hadoop-specific environment variables here.
>>>
>>> # The only required environment variable is JAVA_HOME.  All others are
>>> # optional.  When running a distributed configuration it is best to
>>> # set JAVA_HOME in this file, so that it is correctly defined on
>>> # remote nodes.
>>>
>>> # The java implementation to use.  Required.
>>> export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_25
>>>
>>> # Extra Java CLASSPATH elements.  Optional.
>>> # export HADOOP_CLASSPATH=
>>>
>>> All other params in hadoop-env.sh are commented.
>>>
>>>
>>> On Tue, Jul 23, 2013 at 8:38 AM, Jitendra Yadav <jeetuyadav200890@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> You might have missed some configuration (XML tags). Please check all
>>>> the conf files.
>>>>
>>>> Thanks
>>>>
>>>> On Tue, Jul 23, 2013 at 6:25 PM, Ashish Umrani <ashish.umrani@gmail.com> wrote:
>>>>
>>>>> Hi There,
>>>>>
>>>>> First of all, sorry if I am asking a stupid question. Being new to the
>>>>> Hadoop environment, I am finding it a bit difficult to figure out why
>>>>> it's failing.
>>>>>
>>>>> I have installed hadoop 1.2, based on the instructions given in the
>>>>> following link:
>>>>> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
>>>>>
>>>>> All went well, and I could run start-all.sh; the jps command does show
>>>>> all 5 processes to be present.
>>>>>
>>>>> However, when I try to do
>>>>>
>>>>> hadoop fs -ls
>>>>>
>>>>> I get the following error:
>>>>>
>>>>> hduser@ashish-HP-Pavilion-dv6-Notebook-PC:/usr/local/hadoop/conf$ hadoop fs -ls
>>>>> Warning: $HADOOP_HOME is deprecated.
>>>>>
>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
>>>>> 13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
>>>>> ls: Cannot access .: No such file or directory.
>>>>> hduser@ashish-HP-Pavilion-dv6-Notebook-PC:/usr/local/hadoop/conf$
>>>>>
>>>>> Can someone help me figure out what's the issue in my installation?
>>>>>
>>>>> Regards
>>>>> ashish
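
The "bad conf file: element not <property>" warnings above are the Configuration parser reporting the same thing Jitendra spotted: a <name>/<value> pair sitting directly under <configuration> without a <property> wrapper. Once the conf files parse cleanly, a leftover "ls: Cannot access .: No such file or directory." usually just means the home directory in HDFS does not exist yet, because a bare "hadoop fs -ls" lists /user/<username>. A short sketch of the follow-up, assuming HDFS is up and the hduser account from this thread (the /user/hduser path is the Hadoop 1.x default, not something quoted in the thread):

    hadoop fs -ls /                  # sanity check: the HDFS root should be listable
    hadoop fs -mkdir /user/hduser    # create the default home directory
    hadoop fs -ls                    # "." now resolves to /user/hduser

If the listing still fails with a permission error, Tariq's suggestion applies: chmod 755 on /app/hadoop/tmp (the hadoop.tmp.dir from core-site.xml), making sure hduser owns it.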