Subject: Re: New hadoop 1.2 single node installation giving problems
From: Ashish Umrani <ashish.umrani@gmail.com>
To: user@hadoop.apache.org
Date: Tue, 23 Jul 2013 10:28:38 -0700

Jitendra, Som,

Thanks. The issue was that there was no file there. It's working fine now.

I am able to do -ls and could also do -mkdir and -put.

Now it is time to run the jar, and apparently I am getting:
no main manifest attribute, in wc.jar


But I believe it's because the Maven POM file does not have the main class entry.

I will go ahead and change the POM file and build it again; please let me know if you think of some other reason.

Once again, this user group rocks. I have never seen this quick a response.

Regards
ashish
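
For reference, the "no main manifest attribute" error means the jar's META-INF/MANIFEST.MF has no Main-Class entry, so the launcher does not know which class to run. Assuming the jar is built with the standard maven-jar-plugin, a snippet along these lines in the pom.xml would add one (org.myorg.WordCount is only a placeholder for whatever the job's actual driver class is):

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <configuration>
        <archive>
          <manifest>
            <!-- placeholder; substitute the job's real main class -->
            <mainClass>org.myorg.WordCount</mainClass>
          </manifest>
        </archive>
      </configuration>
    </plugin>
  </plugins>
</build>

Alternatively, the jar can be left as-is and the main class passed explicitly on the command line, e.g. hadoop jar wc.jar org.myorg.WordCount <input> <output>.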


On Tue, Jul 23, 2013 at 10:21 AM, Jitendra Yadav <jeetuyadav200890@gmail.com> wrote:
Try..
hadoop fs -ls /


Thanks


On Tue, Jul 23, 2013 at 10:27 PM, Ashish Umrani <ashish.umrani@gmail.com> wrote:
Thanks Jitendra, Bejoy and Yexi,

I got past that. And now the ls command says it cannot access the directory. I am sure this is a permissions issue. I am just wondering which directory I am missing permissions on.

Any pointers?

And once again, thanks a lot

Regards
ashish

hduser@ashish-HP-Pavilion-dv6-Notebook-PC:/usr/local/hadoop/conf$ hadoop fs -ls
Warning: $HADOOP_HOME is deprecated.

ls: Cannot access .: No such file or directory.
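
A note on why the bare listing fails: with no path argument, hadoop fs -ls lists the current user's HDFS home directory (/user/<username>), which does not exist yet on a fresh install, so this is a missing directory rather than a permissions problem. Assuming hduser is the user that formatted and runs HDFS on this single-node setup, creating the home directory should make the bare -ls work:

hadoop fs -mkdir /user/hduser
hadoop fs -ls

Listing the root, as Jitendra suggests with hadoop fs -ls /, works because / always exists.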



On Tue, Jul 23, 2013 at 9:42 AM, Jitendra Yadav <jeetuyadav200890@gmail.com> wrote:
Hi Ashish,
Please check <property></property> in hdfs-site.xml.

It is missing.
Thanks.
On Tue, Jul 23, 2013 at 9:58 PM, Ashish Umrani <ashish.umrani@gmail.com> wrote:
Hey, thanks for the response. I have changed 4 files during installation:

core-site.xml
mapred-site.xml
hdfs-site.xml and
hadoop-env.sh


I could not find any issues except that all params in hadoop-env.sh are commented out. Only JAVA_HOME is uncommented.

If you have a quick minute, can you please browse through these files in the email and let me know where the issue could be?

Regards
ashish



I am listing those files below.
core-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/app/hadoop/tmp</value>
    <description>A base for other temporary directories.</description>
  </property>

  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
    <description>The name of the default file system.  A URI whose
    scheme and authority determine the FileSystem implementation.  The
    uri's scheme determines the config property (fs.SCHEME.impl) naming
    the FileSystem implementation class.  The uri's authority is used to
    determine the host, port, etc. for a filesystem.</description>
  </property>
</configuration>



mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:54311</value>
    <description>The host and port that the MapReduce job tracker runs
    at.  If "local", then jobs are run in-process as a single map
    and reduce task.
    </description>
  </property>
</configuration>



hdfs-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <name>dfs.replication</name>
  <value>1</value>
  <description>Default block replication.
    The actual number of replications can be specified when the file is created.
    The default is used if replication is not specified in create time.
  </description>
</configuration>



hadoop-env.sh
# Set Hadoop-specific environment variables here.

# The only required environment variable is JAVA_HOME.  All others are
# optional.  When running a distributed configuration it is best to
# set JAVA_HOME in this file, so that it is correctly defined on
# remote nodes.

# The java implementation to use.  Required.
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_25

# Extra Java CLASSPATH elements.  Optional.
# export HADOOP_CLASSPATH=


All other params in hadoop-env.sh are commented.
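
For comparison, the hdfs-site.xml listed above is the file missing the <property> wrapper that Jitendra points out; Hadoop's Configuration parser expects every name/value pair inside <configuration> to be wrapped in one, which is also what the repeated "bad conf file: element not <property>" warnings in the original error output refer to. With the wrapper in place it would look roughly like this:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
    <description>Default block replication.
    The actual number of replications can be specified when the file is created.
    The default is used if replication is not specified in create time.
    </description>
  </property>
</configuration>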








On Tue, Jul 23, 2013 at 8:38 AM, Jitendra Yadav <jeetuyadav200890@gmail.com> wrote:
Hi,
You might have missed some configuration (XML tags); please check all the conf files.
Thanks
On Tue, Jul 23, 2013 at 6:25 PM, Ashish Umrani <ashish.umrani@gmail.com> wrote:
Hi There,

First of all, sorry if I am asking a stupid question. Being new to the Hadoop environment, I am finding it a bit difficult to figure out why it is failing.

I have installed Hadoop 1.2, based on instructions given in the following link:
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

All went well; I could run start-all.sh, and the jps command does show all 5 processes to be present.

However when I try to do

hadoop fs -ls

I get the following error:

hduser@ashish-HP-Pavilion-dv6-Notebook-PC:/usr/local/hadoop/conf$ hadoop fs -ls
Warning: $HADOOP_HOME is deprecated.

13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
13/07/23 05:55:06 WARN conf.Configuration: bad conf file: element not <property>
ls: Cannot access .: No such file or directory.
hduser@ashish-HP-Pavilion-dv6-Notebook-PC:/usr/local/hadoop/conf$



Can someone help me figure out what the issue is in my installation?


Regards
ashish





