Subject: Re: MR2 submit job help
From: Shashidhar Rao <raoshashidhar123@gmail.com>
To: user@hadoop.apache.org
Date: Sun, 2 Jun 2013 01:44:47 +0530

Shahab Yunus,

You are right, thanks so much for your help. Now I am able to see the file with
bin/hadoop fs -ls /

Thanks,
Shashidhar

On Sun, Jun 2, 2013 at 12:51 AM, Shahab Yunus <shahab.yunus@gmail.com> wrote:
> Shashidhar,
>
> You ran the put command with 'fs':
> bin/hadoop fs -put sample.txt /sample3.txt
>
> Have you tried the ls command with 'fs' too? I see you are using 'hdfs'.
> Try it with 'fs'.
>
> Also, the warning can be ignored. You would most probably need to build
> the code with your native libs, but I think that is not needed at this
> point.
>
> Regards,
> Shahab
>
> On Sat, Jun 1, 2013 at 1:22 PM, Shashidhar Rao <raoshashidhar123@gmail.com> wrote:
>
>> Thanks Rahul, that worked partly but not fully.
>>
>> Now when I run
>> bin/hadoop fs -put sample.txt /sample3.txt
>>
>> the warning still comes:
>> 13/06/01 22:43:31 WARN util.NativeCodeLoader: Unable to load
>> native-hadoop library for your platform... using builtin-java classes
>> where applicable
>>
>> But the `sample.txt': No such file or directory error that came earlier
>> does not come any more.
>>
>> Now when I run bin/hadoop hdfs -ls, I get:
>>
>> Error: Could not find or load main class hdfs
>>
>> I have exported HADOOP_HDFS_HOME=${HADOOP_HOME}, where HADOOP_HOME is
>> the root dir; in this dir I have all the hdfs*.jars, and the bin dir is
>> also inside HADOOP_HOME.
>>
>> Is there anything else I have to export?
>>
>> Thanks,
>> shashidhar
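[Archive note: the exchange above can be condensed into a short sketch, assuming a Hadoop 2.x tarball install run from $HADOOP_HOME. The file-system shell subcommand is `fs`; `bin/hadoop hdfs` is not a valid invocation, which is why it fails with "Could not find or load main class hdfs".]

```shell
# Upload a local file to the HDFS root, then list it back -- both via 'fs'.
bin/hadoop fs -put sample.txt /sample3.txt
bin/hadoop fs -ls /

# In Hadoop 2 the HDFS-specific shell is a separate script, bin/hdfs,
# with the 'dfs' subcommand -- not a subcommand of bin/hadoop:
bin/hdfs dfs -ls /
```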
>>
>> On Sat, Jun 1, 2013 at 10:27 PM, Rahul Bhattacharjee <rahul.rec.dgp@gmail.com> wrote:
>>
>>> That's a warning.
>>>
>>> You might not have your user's dir in hdfs.
>>>
>>> Try:
>>>
>>> hadoop fs -put sample.txt /sample.txt
>>>
>>> Rahul
>>>
>>> On Sat, Jun 1, 2013 at 10:07 PM, Shashidhar Rao <raoshashidhar123@gmail.com> wrote:
>>>
>>>> After building MR2, I am able to start all the daemons:
>>>> 27873 Jps
>>>> 27096 NameNode
>>>> 27169 DataNode
>>>> 27326 NodeManager
>>>> 27398 JobHistoryServer
>>>> 13329
>>>> 27257 ResourceManager
>>>> 27043 Bootstrap
>>>>
>>>> but when I run bin/hadoop fs -put sample.txt sample.txt
>>>> I get the error:
>>>> 13/06/01 22:05:48 WARN util.NativeCodeLoader: Unable to load
>>>> native-hadoop library for your platform... using builtin-java classes
>>>> where applicable
>>>> put: `sample.txt': No such file or directory
>>>>
>>>> The native-hadoop library is not loaded.
>>>>
>>>> What do I need to do?
>>>>
>>>> Thanks,
>>>> shashidhar
>>>>
>>>> On Sat, Jun 1, 2013 at 6:41 PM, Shahab Yunus <shahab.yunus@gmail.com> wrote:
>>>>
>>>>> Is the file you are trying to upload named file.txt or file? Have
>>>>> you made sure of that? Is any other command working? Have you tried
>>>>> copyFromLocal?
>>>>>
>>>>> Regards,
>>>>> Shahab
>>>>>
>>>>> On Sat, Jun 1, 2013 at 4:05 AM, Rahul Bhattacharjee <rahul.rec.dgp@gmail.com> wrote:
>>>>>
>>>>>> You should be able to use hadoop fs -put <file> . , with the file
>>>>>> in the directory where you are running the command.
>>>>>>
>>>>>> On Sat, Jun 1, 2013 at 5:31 AM, Shashidhar Rao <raoshashidhar123@gmail.com> wrote:
>>>>>>
>>>>>>> Hi Users,
>>>>>>>
>>>>>>> Please help me with some documentation on how to submit a job in
>>>>>>> YARN and upload files to HDFS. Can I still use the MR1 commands,
>>>>>>> hadoop fs -put for uploading files and hadoop jar job.jar input
>>>>>>> output? I ran into errors saying the file cannot be uploaded
>>>>>>> because the file cannot be found. The directory structure is the
>>>>>>> same.
>>>>>>>
>>>>>>> MR2 directory structure (same as MR1):
>>>>>>> hadoop/bin/ ~ all hadoop files, including the hadoop executable
>>>>>>> hadoop/ ~ file.txt, job.jar, etc.
>>>>>>> hadoop/etc/hadoop/ ~ all site files and properties
>>>>>>>
>>>>>>> I cd to hadoop and then execute bin/hadoop fs -put file file
>>>>>>> It says the file cannot be found, but I was able to run this
>>>>>>> without error in MR1.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> shashidhar
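[Archive note: the likely root cause discussed in the thread is that relative HDFS paths resolve against /user/<username>, which does not exist on a freshly formatted cluster, so `fs -put sample.txt sample.txt` fails while an absolute destination works. A minimal end-to-end sketch, assuming a Hadoop 2.x install run from $HADOOP_HOME; the /user/$USER and input/output paths are illustrative:]

```shell
# Create the HDFS home directory first; relative destinations like
# 'sample.txt' resolve against it and fail if it is missing.
bin/hadoop fs -mkdir -p /user/$USER/input

# Upload the input file; the NativeCodeLoader warning is harmless here.
bin/hadoop fs -put sample.txt /user/$USER/input/

# Job submission is unchanged from MR1; with MR2/YARN configs in place,
# the same command runs the job on YARN.
bin/hadoop jar job.jar input output
```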