From: Mohammad Tariq
Date: Thu, 27 Jun 2013 23:10:17 +0530
Subject: Re: Can not follow Single Node Setup example.
To: user@hadoop.apache.org

No. This means that you are trying to copy an entire directory instead of a file. Do this:

bin/hadoop fs -put conf/ /input/

Warm Regards,
Tariq
cloudfront.blogspot.com

On Thu, Jun 27, 2013 at 10:37 PM, Peng Yu wrote:
> Hi,
>
> ~/Downloads/hadoop-install/hadoop$ rm -rf ~/input/conf/
> ~/Downloads/hadoop-install/hadoop$ bin/hadoop fs -put conf input
> put: Target input/conf is a directory
>
> I get the above output. Is it the correct output? Thanks.
>
> On Wed, Jun 26, 2013 at 10:51 AM, Shahab Yunus wrote:
> > It is looking for a file within your login folder
> > /user/py/input/conf
> >
> > You are running your job from
> > hadoop/bin
> > and I think the hadoop job is looking for files in the current folder.
> >
> > Regards,
> > Shahab
> >
> > On Wed, Jun 26, 2013 at 11:02 AM, Peng Yu wrote:
> >>
> >> Hi,
> >>
> >> Here is what I have.
> >>
> >> ~/Downloads/hadoop-install/hadoop$ ls
> >> CHANGES.txt  README.txt  c++      hadoop-ant-1.1.2.jar
> >> hadoop-examples-1.1.2.jar     hadoop-tools-1.1.2.jar  ivy.xml  logs
> >> src
> >> LICENSE.txt  bin         conf     hadoop-client-1.1.2.jar
> >> hadoop-minicluster-1.1.2.jar  input                   lib      sbin
> >> webapps
> >> NOTICE.txt   build.xml   contrib  hadoop-core-1.1.2.jar
> >> hadoop-test-1.1.2.jar         ivy                     libexec  share
> >> ~/Downloads/hadoop-install/hadoop$ ls input/
> >> capacity-scheduler.xml  core-site.xml  fair-scheduler.xml
> >> hadoop-policy.xml  hdfs-site.xml  mapred-queue-acls.xml
> >> mapred-site.xml
> >>
> >> On Wed, Jun 26, 2013 at 10:00 AM, Shahab Yunus
> >> wrote:
> >> > Basically whether this step worked or not:
> >> >
> >> > $ cp conf/*.xml input
> >> >
> >> > Regards,
> >> > Shahab
> >> >
> >> > On Wed, Jun 26, 2013 at 10:58 AM, Shahab Yunus
> >> > wrote:
> >> >>
> >> >> Have you verified that the 'input' folder that your job needs
> >> >> exists on the hdfs (single node setup)?
> >> >>
> >> >> Regards,
> >> >> Shahab
> >> >>
> >> >> On Wed, Jun 26, 2013 at 10:53 AM, Peng Yu wrote:
> >> >>>
> >> >>> Hi,
> >> >>>
> >> >>> http://hadoop.apache.org/docs/r1.1.2/single_node_setup.html
> >> >>>
> >> >>> I followed the above instructions. But I get the following errors.
> >> >>> Does anybody know what is wrong? Thanks.
> >> >>>
> >> >>> ~/Downloads/hadoop-install/hadoop$ bin/hadoop jar
> >> >>> hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
> >> >>> Warning: $HADOOP_HOME is deprecated.
> >> >>>
> >> >>> 13/06/26 09:49:14 WARN util.NativeCodeLoader: Unable to load
> >> >>> native-hadoop library for your platform... using builtin-java
> >> >>> classes where applicable
> >> >>> 13/06/26 09:49:14 WARN snappy.LoadSnappy: Snappy native library not
> >> >>> loaded
> >> >>> 13/06/26 09:49:14 INFO mapred.FileInputFormat: Total input paths to
> >> >>> process : 2
> >> >>> 13/06/26 09:49:14 INFO mapred.JobClient: Cleaning up the staging area
> >> >>> hdfs://localhost:9000/opt/local/var/hadoop/cache/mapred/staging/py/.staging/job_201306260838_0001
> >> >>> 13/06/26 09:49:14 ERROR security.UserGroupInformation:
> >> >>> PriviledgedActionException as:py cause:java.io.IOException: Not a
> >> >>> file: hdfs://localhost:9000/user/py/input/conf
> >> >>> java.io.IOException: Not a file:
> >> >>> hdfs://localhost:9000/user/py/input/conf
> >> >>>         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:215)
> >> >>>         at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1051)
> >> >>>         at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1043)
> >> >>>         at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
> >> >>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
> >> >>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
> >> >>>         at java.security.AccessController.doPrivileged(Native Method)
> >> >>>         at javax.security.auth.Subject.doAs(Subject.java:396)
> >> >>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
> >> >>>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
> >> >>>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:886)
> >> >>>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1323)
> >> >>>         at org.apache.hadoop.examples.Grep.run(Grep.java:69)
> >> >>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >> >>>         at org.apache.hadoop.examples.Grep.main(Grep.java:93)
> >> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >> >>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >> >>>         at java.lang.reflect.Method.invoke(Method.java:597)
> >> >>>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> >> >>>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
> >> >>>         at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
> >> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >> >>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >> >>>         at java.lang.reflect.Method.invoke(Method.java:597)
> >> >>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> >> >>>
> >> >>> --
> >> >>> Regards,
> >> >>> Peng
> >> >>
> >> >>
> >> >
> >>
> >>
> >> --
> >> Regards,
> >> Peng
> >
> >
>
> --
> Regards,
> Peng
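The nesting behavior behind the `put: Target input/conf is a directory` message can be reproduced on the local filesystem. This is a sketch with plain `cp`/`mkdir` in a throwaway temp directory, not the HDFS shell itself, but the rule is the same: copying a directory into an existing target directory nests it one level down, which is how `input/conf` appeared inside `input`.

```shell
set -e
tmp=$(mktemp -d)   # throwaway working directory for the demonstration
cd "$tmp"
mkdir conf input
echo '<configuration/>' > conf/core-site.xml

# Copying the directory itself nests it: the file ends up at
# input/conf/core-site.xml, not input/core-site.xml.
cp -r conf input
test -f input/conf/core-site.xml

# Copying only the files (the tutorial's `cp conf/*.xml input` step)
# keeps them at the top level, which is what the grep example expects
# when it lists `input` as its input path.
rm -rf input && mkdir input
cp conf/*.xml input
test -f input/core-site.xml
```

The `Not a file: .../input/conf` exception in the stack trace is the job-side symptom of the same mistake: old-API `FileInputFormat` in Hadoop 1.x does not recurse into subdirectories of the input path, so the nested `conf` directory is rejected as an input split.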