Subject: Re: Can not follow Single Node Setup example.
From: Shahab Yunus <shahab.yunus@gmail.com>
To: user@hadoop.apache.org
Date: Wed, 26 Jun 2013 10:58:23 -0400

Have you verified that the 'input' folder your job needs exists on HDFS
(single-node setup)?

Regards,
Shahab

On Wed, Jun 26, 2013 at 10:53 AM, Peng Yu wrote:
> Hi,
>
> http://hadoop.apache.org/docs/r1.1.2/single_node_setup.html
>
> I followed the above instructions, but I get the following errors.
> Does anybody know what is wrong? Thanks.
>
> ~/Downloads/hadoop-install/hadoop$ bin/hadoop jar
> hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
> Warning: $HADOOP_HOME is deprecated.
>
> 13/06/26 09:49:14 WARN util.NativeCodeLoader: Unable to load
> native-hadoop library for your platform...
> using builtin-java classes where applicable
> 13/06/26 09:49:14 WARN snappy.LoadSnappy: Snappy native library not loaded
> 13/06/26 09:49:14 INFO mapred.FileInputFormat: Total input paths to process : 2
> 13/06/26 09:49:14 INFO mapred.JobClient: Cleaning up the staging area
> hdfs://localhost:9000/opt/local/var/hadoop/cache/mapred/staging/py/.staging/job_201306260838_0001
> 13/06/26 09:49:14 ERROR security.UserGroupInformation:
> PriviledgedActionException as:py cause:java.io.IOException: Not a
> file: hdfs://localhost:9000/user/py/input/conf
> java.io.IOException: Not a file: hdfs://localhost:9000/user/py/input/conf
>         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:215)
>         at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1051)
>         at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1043)
>         at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:886)
>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1323)
>         at org.apache.hadoop.examples.Grep.run(Grep.java:69)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.hadoop.examples.Grep.main(Grep.java:93)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>         at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>
> --
> Regards,
> Peng
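The "Not a file" error in the trace says the job's input path contains a subdirectory (hdfs://localhost:9000/user/py/input/conf), and the old mapred FileInputFormat does not recurse into subdirectories when computing splits. A minimal sketch of how one might check and fix this with the Hadoop 1.x shell (the exact layout of the user's HDFS home directory is an assumption inferred from the trace):

```shell
# List what the job will actually read; a leading 'd' in the
# permissions column marks a directory, which
# FileInputFormat.getSplits() rejects with "Not a file".
bin/hadoop fs -ls input

# Option 1: remove the offending nested directory and rerun the job.
bin/hadoop fs -rmr input/conf

# Option 2: rebuild the input as flat files, as the r1.1.2 tutorial
# intends (running 'fs -put conf input' twice nests conf inside input,
# which is one common way to end up in this state).
bin/hadoop fs -rmr input
bin/hadoop fs -put conf input
```

These commands require a running single-node HDFS as described in the tutorial; `-rmr` is the Hadoop 1.x spelling (later releases use `-rm -r`).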