Date: Thu, 27 Jun 2013 12:07:23 -0500
Subject: Re: Can not follow Single Node Setup example.
From: Peng Yu
To: user@hadoop.apache.org

Hi,

~/Downloads/hadoop-install/hadoop$ rm -rf ~/input/conf/
~/Downloads/hadoop-install/hadoop$ bin/hadoop fs -put conf input
put: Target input/conf is a directory

I get the above output. Is it the correct output? Thanks.
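[The "put: Target input/conf is a directory" reply means an earlier fs -put had already created input/conf on HDFS, and -put refuses to overwrite an existing directory; it is an error, not the expected output. Note also that "rm -rf ~/input/conf/" above removed a local directory, not the copy on HDFS. A minimal way to inspect and start over, assuming Hadoop 1.x shell syntax:

    bin/hadoop fs -ls input        # see what is already under input on HDFS
    bin/hadoop fs -rmr input       # recursively remove the stale copy (1.x -rmr)
    bin/hadoop fs -put conf input  # re-upload; input ends up holding conf's files]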
On Wed, Jun 26, 2013 at 10:51 AM, Shahab Yunus wrote:
> It is looking for a file within your login folder
> /user/py/input/conf
>
> You are running your job from
> hadoop/bin
> and I think the hadoop job is looking for files in the current folder.
>
> Regards,
> Shahab
>
>
> On Wed, Jun 26, 2013 at 11:02 AM, Peng Yu wrote:
>>
>> Hi,
>>
>> Here is what I have.
>>
>> ~/Downloads/hadoop-install/hadoop$ ls
>> CHANGES.txt  README.txt  c++      hadoop-ant-1.1.2.jar
>> hadoop-examples-1.1.2.jar     hadoop-tools-1.1.2.jar  ivy.xml  logs  src
>> LICENSE.txt  bin         conf     hadoop-client-1.1.2.jar
>> hadoop-minicluster-1.1.2.jar  input                   lib      sbin  webapps
>> NOTICE.txt   build.xml   contrib  hadoop-core-1.1.2.jar
>> hadoop-test-1.1.2.jar         ivy                     libexec  share
>> ~/Downloads/hadoop-install/hadoop$ ls input/
>> capacity-scheduler.xml  core-site.xml  fair-scheduler.xml
>> hadoop-policy.xml       hdfs-site.xml  mapred-queue-acls.xml
>> mapred-site.xml
>>
>> On Wed, Jun 26, 2013 at 10:00 AM, Shahab Yunus wrote:
>> > Basically whether this step worked or not:
>> >
>> > $ cp conf/*.xml input
>> >
>> > Regards,
>> > Shahab
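[For reference, the standalone walkthrough on that page assumes input is a directory of plain files; its sequence is roughly:

    $ mkdir input
    $ cp conf/*.xml input
    $ bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
    $ cat output/*

On a pseudo-distributed setup, as used in the messages below, the same flat layout has to exist on HDFS.]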
>> >
>> >
>> > On Wed, Jun 26, 2013 at 10:58 AM, Shahab Yunus wrote:
>> >>
>> >> Have you verified that the 'input' folder exists on the hdfs (single
>> >> node setup) that your job needs?
>> >>
>> >> Regards,
>> >> Shahab
>> >>
>> >>
>> >> On Wed, Jun 26, 2013 at 10:53 AM, Peng Yu wrote:
>> >>>
>> >>> Hi,
>> >>>
>> >>> http://hadoop.apache.org/docs/r1.1.2/single_node_setup.html
>> >>>
>> >>> I followed the above instructions. But I get the following errors.
>> >>> Does anybody know what is wrong? Thanks.
>> >>>
>> >>> ~/Downloads/hadoop-install/hadoop$ bin/hadoop jar
>> >>> hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
>> >>> Warning: $HADOOP_HOME is deprecated.
>> >>>
>> >>> 13/06/26 09:49:14 WARN util.NativeCodeLoader: Unable to load
>> >>> native-hadoop library for your platform... using builtin-java classes
>> >>> where applicable
>> >>> 13/06/26 09:49:14 WARN snappy.LoadSnappy: Snappy native library not loaded
>> >>> 13/06/26 09:49:14 INFO mapred.FileInputFormat: Total input paths to
>> >>> process : 2
>> >>> 13/06/26 09:49:14 INFO mapred.JobClient: Cleaning up the staging area
>> >>> hdfs://localhost:9000/opt/local/var/hadoop/cache/mapred/staging/py/.staging/job_201306260838_0001
>> >>> 13/06/26 09:49:14 ERROR security.UserGroupInformation:
>> >>> PriviledgedActionException as:py cause:java.io.IOException: Not a
>> >>> file: hdfs://localhost:9000/user/py/input/conf
>> >>> java.io.IOException: Not a file: hdfs://localhost:9000/user/py/input/conf
>> >>>         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:215)
>> >>>         at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1051)
>> >>>         at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1043)
>> >>>         at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
>> >>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
>> >>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
>> >>>         at java.security.AccessController.doPrivileged(Native Method)
>> >>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>> >>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>> >>>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
>> >>>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:886)
>> >>>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1323)
>> >>>         at org.apache.hadoop.examples.Grep.run(Grep.java:69)
>> >>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >>>         at org.apache.hadoop.examples.Grep.main(Grep.java:93)
>> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >>>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>> >>>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>> >>>         at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> >>>
>> >>> --
>> >>> Regards,
>> >>> Peng
>>
>> --
>> Regards,
>> Peng

--
Regards,
Peng
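[The "Not a file: hdfs://localhost:9000/user/py/input/conf" line is the root cause: the old mapred FileInputFormat in Hadoop 1.x does not recurse, so getSplits throws as soon as it finds a subdirectory under the input path, and a repeated fs -put had nested conf inside the existing input directory on HDFS. A recovery sketch, assuming the *.xml files from the first upload are still directly under input:

    bin/hadoop fs -rmr input/conf  # drop only the nested directory
    bin/hadoop fs -ls input        # should now list just the xml files
    bin/hadoop fs -rmr output      # clear partial output from earlier runs, if any
    bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs[a-z.]+'

If input turns out to contain nothing but the conf subdirectory, running the job against input/conf instead would also sidestep the non-recursive split listing.]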