From: David Greer
To: hdfs-user@hadoop.apache.org
Date: Tue, 13 Oct 2009 16:19:51 -0700
Subject: Re: Security error running hadoop with MaxTemperature example

Jason Venner writes ...

>My first guess is that, for whatever reason, the user you are running the
>job as cannot write to the directory specified for hadoop.tmp.dir in your
>configuration.
>This is usually in the system temporary area and not an issue.


Assuming the files are in /tmp/hadoop-david, everything seems fine locally:

[david@tweety ~]$ cd /tmp/hadoop-david
[david@tweety hadoop-david]$ mkdir foo
[david@tweety hadoop-david]$ rmdir foo
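The same local check can be scripted. This is only a sketch, and it assumes hadoop.tmp.dir is still at its 0.20 default of /tmp/hadoop-${user.name} (which matches the /tmp/hadoop-david above); adjust the path if your configuration overrides it:

```shell
# Assumes hadoop.tmp.dir is the 0.20 default, /tmp/hadoop-${user.name}.
dir="/tmp/hadoop-${USER:-$(id -un)}"
mkdir -p "$dir"
ls -ld "$dir"                      # show owner and mode of the directory
if [ -w "$dir" ]; then
  echo "local tmp dir is writable"
else
  echo "local tmp dir is NOT writable"
fi
```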

Using hadoop fs commands, all appears fine too:

[david@tweety ~]$ hadoop fs -ls
Found 1 items
-rw-r--r-- 1 david supergroup 28081 2009-10-06 23:27 /user/david/docnotes.txt
[david@tweety ~]$ hadoop fs -mkdir foo
[david@tweety ~]$ hadoop fs -touchz foo/bar.txt
[david@tweety ~]$ hadoop fs -rm foo/bar.txt
Deleted hdfs://localhost/user/david/foo/bar.txt
[david@tweety ~]$ hadoop fs -rmr foo
Deleted hdfs://localhost/user/david/foo
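The commands above only exercise /user/david. Since the exception further down names an inode called "tmp", it may also be worth listing the HDFS root to see who owns /tmp there. A guarded sketch (it only does anything on a machine with the hadoop CLI on the PATH):

```shell
# Sketch only: list the HDFS root so the owner and mode of /tmp can be
# inspected. Guarded so it is a no-op where the hadoop CLI is absent.
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -ls /
else
  echo "hadoop CLI not found; run this from a cluster client node"
fi
```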

But when I try to run MaxTemperature (this time with a valid input file and output directory), I still get an error:

[david@tweety java]$ hadoop MaxTemperature sample.txt output
09/10/13 16:18:29 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
09/10/13 16:18:29 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
Exception in thread "main" org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=david, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x

The error message names an inode "tmp" owned by "root". I wonder whether that's the clue. I'm out of ideas.
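For what it's worth, one explanation that would fit this trace (a guess, not something the trace proves): in 0.20 the job client writes job files into HDFS under mapred.system.dir, which defaults to ${hadoop.tmp.dir}/mapred/system, i.e. a path starting with /tmp inside HDFS. If that HDFS /tmp was created by root with mode rwxr-xr-x, as the exception shows, user david cannot write under it. A hedged sketch of a core-site.xml override that would sidestep this (the property name is standard; the value shown is just an example path):

```xml
<!-- Sketch only: relocate hadoop.tmp.dir, and with it the default
     mapred.system.dir (${hadoop.tmp.dir}/mapred/system), away from /tmp.
     The value is an example, not a recommendation. -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/${user.name}/hadoop-tmp</value>
</property>
```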


