Date: Wed, 26 Jun 2013 09:53:23 -0500
Subject: Cannot follow Single Node Setup example
From: Peng Yu <pengyu.ut@gmail.com>
To: user@hadoop.apache.org

Hi,

http://hadoop.apache.org/docs/r1.1.2/single_node_setup.html

I followed the instructions on the page above, but the grep example fails with the error shown below. Does anybody know what is wrong? Thanks.
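In case it helps, the setup steps I ran before this, as best I can reconstruct them from the pseudo-distributed section of that page (so they may not match exactly what I typed), were:

  bin/hadoop namenode -format    # format a new distributed filesystem
  bin/start-all.sh               # start the hadoop daemons
  bin/hadoop fs -put conf input  # copy the input files into the distributed filesystem

and then the grep example, which fails like this: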
~/Downloads/hadoop-install/hadoop$ bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
Warning: $HADOOP_HOME is deprecated.

13/06/26 09:49:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/06/26 09:49:14 WARN snappy.LoadSnappy: Snappy native library not loaded
13/06/26 09:49:14 INFO mapred.FileInputFormat: Total input paths to process : 2
13/06/26 09:49:14 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/opt/local/var/hadoop/cache/mapred/staging/py/.staging/job_201306260838_0001
13/06/26 09:49:14 ERROR security.UserGroupInformation: PriviledgedActionException as:py cause:java.io.IOException: Not a file: hdfs://localhost:9000/user/py/input/conf
java.io.IOException: Not a file: hdfs://localhost:9000/user/py/input/conf
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:215)
        at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1051)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1043)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:886)
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1323)
        at org.apache.hadoop.examples.Grep.run(Grep.java:69)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.examples.Grep.main(Grep.java:93)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

--
Regards,
Peng
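P.S. The exception mentions hdfs://localhost:9000/user/py/input/conf, so perhaps what ended up inside the input directory matters. If it would help, I can post a recursive listing of it, e.g. from something like

  bin/hadoop fs -lsr input

(assuming -lsr is still the right way to list a directory recursively on 1.1.2).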