hadoop-hdfs-user mailing list archives

From David Greer <da...@davidgreer.ca>
Subject Security error running hadoop with MaxTemperature example
Date Mon, 12 Oct 2009 16:52:18 GMT
Hi Everyone,

I'm trying to get my first MapReduce example to work. Background:

RedHat ES 5.2
Sun Java 1.6.0_16-b01
Hadoop 0.20.1+133 (Cloudera distro)

I've started the hadoop daemons, created an HDFS locally, and checked that
basic operations in HDFS appear to work.
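
Roughly, the sanity checks I ran (from memory, so the commands may not be exact):

jps                      # NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker all show up
hadoop fs -mkdir sanity  # create and then remove a scratch directory under /user/david
hadoop fs -rmr sanity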

I'm trying to get the first, most basic example from Tom White's book "Hadoop:
The Definitive Guide" to work. I'm running Hadoop in pseudo-distributed mode as
a regular (non-root) user. Basic hadoop commands appear to work:

hadoop fs -ls
Found 1 items
-rw-r--r--   1 david supergroup      28081 2009-10-06 23:27
/user/david/docnotes.txt

I compiled the examples in chapter 2 "by hand" (why is a separate thread).
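
For context, the compile step was roughly this (from memory; I'm not certain
the Cloudera core jar name is exactly right):

javac -classpath /usr/lib/hadoop/hadoop-0.20.1+133-core.jar \
    MaxTemperature.java MaxTemperatureMapper.java MaxTemperatureReducer.java
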
I then tried invoking MaxTemperature with non-existent files; at this point I'm
just trying to see whether everything loads and initializes:

export HADOOP_CLASSPATH="./"
hadoop MaxTemperature foo bar
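
In case it matters, my driver is essentially the chapter 2 driver from the book
(the old JobConf API); I typed it in by hand, so minor differences are possible:

import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class MaxTemperature {

  public static void main(String[] args) throws IOException {
    if (args.length != 2) {
      System.err.println("Usage: MaxTemperature <input path> <output path>");
      System.exit(-1);
    }

    // Job configuration: input/output paths plus mapper/reducer classes
    JobConf conf = new JobConf(MaxTemperature.class);
    conf.setJobName("Max temperature");

    FileInputFormat.addInputPath(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    conf.setMapperClass(MaxTemperatureMapper.class);
    conf.setReducerClass(MaxTemperatureReducer.class);
    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(IntWritable.class);

    // Submitting the job is what kicks off the HDFS writes that fail below
    JobClient.runJob(conf);
  }
}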

I get the error message:

Exception in thread "main"
org.apache.hadoop.security.AccessControlException:
org.apache.hadoop.security.AccessControlException: Permission denied:
user=david, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x

There's a long stack trace, the start of which looks like:

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:909)
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:262)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1162)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:306)
        .........
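
Reading the message, it looks as though job submission is trying to create a
directory under an HDFS path named "tmp" that is owned by root with mode
rwxr-xr-x, so user david is denied WRITE. I can at least inspect the ownership
with:

hadoop fs -ls /

but I'm not sure what the proper fix is (change that directory's permissions or
owner? a config setting I've missed?).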

I'm out of ideas at this point. Any suggestions for where I should look to
solve this?

Cheers,

David
