Subject: Re: Can not follow Single Node Setup example.
From: Shahab Yunus
To: user@hadoop.apache.org
Date: Wed, 26 Jun 2013 11:00:57 -0400

Basically, check whether this step worked or not:

$ cp conf/*.xml input

Regards,
Shahab

On Wed, Jun 26, 2013 at 10:58 AM, Shahab Yunus wrote:
> Have you verified that the 'input' folder that your job needs exists on
> HDFS (single-node setup)?
>
> Regards,
> Shahab
>
> On Wed, Jun 26, 2013 at 10:53 AM, Peng Yu wrote:
>> Hi,
>>
>> http://hadoop.apache.org/docs/r1.1.2/single_node_setup.html
>>
>> I followed the above instructions, but I get the following errors.
>> Does anybody know what is wrong? Thanks.
>>
>> ~/Downloads/hadoop-install/hadoop$ bin/hadoop jar \
>>     hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
>> Warning: $HADOOP_HOME is deprecated.
>>
>> 13/06/26 09:49:14 WARN util.NativeCodeLoader: Unable to load
>> native-hadoop library for your platform... using builtin-java classes
>> where applicable
>> 13/06/26 09:49:14 WARN snappy.LoadSnappy: Snappy native library not loaded
>> 13/06/26 09:49:14 INFO mapred.FileInputFormat: Total input paths to process : 2
>> 13/06/26 09:49:14 INFO mapred.JobClient: Cleaning up the staging area
>> hdfs://localhost:9000/opt/local/var/hadoop/cache/mapred/staging/py/.staging/job_201306260838_0001
>> 13/06/26 09:49:14 ERROR security.UserGroupInformation:
>> PriviledgedActionException as:py cause:java.io.IOException: Not a
>> file: hdfs://localhost:9000/user/py/input/conf
>> java.io.IOException: Not a file: hdfs://localhost:9000/user/py/input/conf
>>         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:215)
>>         at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1051)
>>         at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1043)
>>         at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
>>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:886)
>>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1323)
>>         at org.apache.hadoop.examples.Grep.run(Grep.java:69)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>         at org.apache.hadoop.examples.Grep.main(Grep.java:93)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>>         at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>
>> --
>> Regards,
>> Peng
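[Editor's note] The "Not a file: hdfs://localhost:9000/user/py/input/conf" error in the trace above means the job's HDFS `input` directory contains a subdirectory (`conf`), which the old `mapred.FileInputFormat` rejects: it expects every path under `input` to be a plain file. This typically happens when `hadoop fs -put conf input` copies the whole `conf` directory instead of its XML files. A minimal diagnostic-and-repair sketch, assuming the paths from the log above and Hadoop 1.x shell commands:

```shell
# List the HDFS input directory; a line starting with 'd' for
# input/conf confirms a nested directory was uploaded.
bin/hadoop fs -ls input

# Remove the nested directory (path taken from the error message),
# then upload only the flat XML files. -rmr is the Hadoop 1.x
# recursive delete.
bin/hadoop fs -rmr input/conf
bin/hadoop fs -put conf/*.xml input/

# Re-run the grep example from the tutorial.
bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
```

These commands assume a running single-node cluster as configured in the linked tutorial; the exact `input` layout on your HDFS may differ, so check the `-ls` output before deleting anything.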