hadoop-common-user mailing list archives

From "Mulcahy, Stephen" <stephen.mulc...@deri.org>
Subject RE: Permissions needed to run RandomWriter ?
Date Fri, 26 Jun 2009 20:29:08 GMT
[Apologies for the top-post, sending this from a dodgy webmail client]

Hi Alex,

My hadoop-site.xml is as follows:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>hadoop01:9001</value>
  </property>

  <property>
    <name>fs.default.name</name>
    <value>hdfs://hadoop01:9000</value>
  </property>

  <property>
    <name>hadoop.tmp.dir</name>
    <value>/data1/hadoop-tmp/</value>
  </property>

  <property>
    <name>dfs.data.dir</name>
    <value>/data1/hdfs,/data2/hdfs</value>
  </property>
</configuration>

Any comments welcome,

-stephen



-----Original Message-----
From: Alex Loddengaard [mailto:alex@cloudera.com]
Sent: Fri 26/06/2009 18:32
To: core-user@hadoop.apache.org
Subject: Re: Permissions needed to run RandomWriter ?
 
Hey Stephen,

What does your hadoop-site.xml look like?  The Exception is in
java.io.UnixFileSystem, which makes me think that you're actually creating
and modifying directories on your local file system instead of HDFS.  Make
sure "fs.default.name" looks like "hdfs://your-namenode.domain.com:PORT".
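
For comparison, a minimal fs.default.name stanza of the shape Alex describes would look something like this (the hostname and port below are placeholders, not values from this thread):

```xml
<property>
  <name>fs.default.name</name>
  <value>hdfs://your-namenode.domain.com:9000</value>
</property>
```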

Alex

On Fri, Jun 26, 2009 at 4:40 AM, stephen mulcahy
<stephen.mulcahy@deri.org>wrote:

> Hi,
>
> I've just installed a new test cluster and I'm trying to give it a quick
> smoke test with RandomWriter and Sort.
>
> I can run these fine with the superuser account. When I try to run them as
> another user I run into problems even though I've created the output
> directory and given permissions to the other user to write to this
> directory. i.e.
>
> 1. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo
> mkdir: org.apache.hadoop.fs.permission.AccessControlException: Permission
> denied: user=smulcahy, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x
>
> OK - we don't have permissions anyway
>
> 2. hadoop@hadoop01:/$ hadoop fs -mkdir /foo
>
> OK
>
> 3. hadoop fs -chown -R smulcahy /foo
>
> OK
>
> 4. smulcahy@hadoop01:~$ hadoop fs -mkdir /foo/test
>
> OK
>
> 5. smulcahy@hadoop01:~$ hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar
> randomwriter /foo
> java.io.IOException: Permission denied
>        at java.io.UnixFileSystem.createFileExclusively(Native Method)
>        at java.io.File.checkAndCreate(File.java:1704)
>        at java.io.File.createTempFile(File.java:1793)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
>        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>
> Any suggestions on why step 5. is failing even though I have write
> permissions to /foo - do I need permissions on some other directory also or
> ... ?
>
> Thanks,
>
> -stephen
>
> --
> Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
> NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
> http://di2.deri.ie    http://webstar.deri.ie    http://sindice.com
>
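
A note on the failure in step 5: the stack trace points at java.io.File.createTempFile called from RunJar.main, and in Hadoop releases of this era RunJar unpacks the job jar into a temp directory under hadoop.tmp.dir on the *local* filesystem before submission. If that directory (here /data1/hadoop-tmp/, per the hadoop-site.xml above) was created by the hadoop superuser with owner-only permissions, other users hit exactly this "Permission denied" regardless of their HDFS permissions on /foo. A minimal sketch of the check, using a throwaway directory in place of the real hadoop.tmp.dir:

```shell
# Stand-in for the configured hadoop.tmp.dir (/data1/hadoop-tmp in the
# thread); RunJar needs to create its hadoop-unjar* temp dir under it.
HADOOP_TMP=$(mktemp -d)

# Simulate the suspected setup: directory created by the 'hadoop'
# superuser with owner-only permissions.
chmod 700 "$HADOOP_TMP"

# The submitting user needs write+execute on this LOCAL directory,
# independent of any HDFS permissions on the job's output path.
if [ -w "$HADOOP_TMP" ] && [ -x "$HADOOP_TMP" ]; then
  echo "hadoop.tmp.dir is writable: RunJar can unpack the job jar"
else
  echo "hadoop.tmp.dir is NOT writable: expect java.io.IOException: Permission denied"
fi

rm -rf "$HADOOP_TMP"
```

Since the test directory here is owned by whoever runs the sketch, it reports writable; on the cluster, running the same `[ -w ... ]` check as the non-superuser against the real hadoop.tmp.dir (or a world-writable tmp dir such as /tmp) would distinguish this local-permissions cause from an HDFS one.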

