hadoop-common-user mailing list archives

From Ben Hardy <benha...@gmail.com>
Subject Re: Access Error
Date Wed, 06 Oct 2010 22:54:27 GMT
I'm getting this same error when running hadoop jobs from a remote client,
but it works fine on the master.

It looks like an HDFS permission issue, but plain HDFS file operations from
the same remote client work just fine, which is odd. This is on CDH3 beta.

It seems to be complaining about my user not being able to write to a
directory that is owned and writable by my user, unless I'm misreading the
error.

Example:

# copyFromLocal works...
$ export HDFS_ROOT=hdfs://hadoop0001:54310
$ hadoop fs -copyFromLocal run.counter $HDFS_ROOT/user/bhardy/
# no complaints

# normal hadoop jobs don't
$ hadoop jar Processor-jobjar-1.7.jar com.mycorp.ProcessorJob \
>   $HDFS_ROOT/user/bhardy/pdp-intermediate \
>   $HDFS_ROOT/user/bhardy/pdp-intermediate2
10/10/06 15:08:33 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
Exception in thread "main" org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=bhardy, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:914)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:262)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1120)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:264)
at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:573)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:730)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1249)
at com.eharmony.matching.offline.pdp.phase2.PdpPhase2Job.run(PdpPhase2Job.java:67)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.eharmony.matching.offline.pdp.phase2.PdpPhase2Job.main(PdpPhase2Job.java:85)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=bhardy, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:176)
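For what it's worth, the check PermissionChecker applies at the bottom of that trace is, as far as I can tell, plain POSIX-style mode bits on the denied inode. A toy model of it (constants and names are mine, not Hadoop's):

```python
# Toy model of an HDFS-style permission check (roughly what
# PermissionChecker.check does); constants and names are mine.
READ, WRITE, EXECUTE = 4, 2, 1

def allowed(user, groups, owner, group, mode, access):
    """mode is the 9-bit rwxrwxrwx value, e.g. 0o755."""
    if user == owner:
        bits = (mode >> 6) & 7      # owner bits
    elif group in groups:
        bits = (mode >> 3) & 7      # group bits
    else:
        bits = mode & 7             # other bits
    return bool(bits & access)

# The denied inode from the trace is hadoop:supergroup:rwxr-xr-x (0o755).
# bhardy is neither the owner nor in supergroup, so WRITE falls through
# to the "other" bits (r-x) and is refused; the owner would be allowed.
print(allowed("bhardy", {"bhardy"}, "hadoop", "supergroup", 0o755, WRITE))
print(allowed("hadoop", {"supergroup"}, "hadoop", "supergroup", 0o755, WRITE))
```

Which is consistent with the message, so the puzzle isn't the check itself; it's why job submission from the remote client writes to an inode owned by hadoop at all.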


The client has the following properties set, and the cluster is pretty much
using the defaults.

 fs.default.name=hdfs://hadoop0001:54310/
 dfs.datanode.address=hdfs://hadoop0001:54310/
 mapred.job.tracker=hadoop0002:54311

My question is: where would I even start looking to diagnose this
exception?

In my hdfs-site.xml, dfs.permissions is not explicitly set. Even if it
defaults to true, that should be fine; my user resolves correctly (whoami
works with no problem).
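If permissions do turn out to be the culprit, one blunt way to confirm it is to disable them temporarily. On this vintage of Hadoop that would be `dfs.permissions` in hdfs-site.xml on the NameNode (a diagnostic sketch only, not something to leave on):

```xml
<!-- hdfs-site.xml on the NameNode; takes effect after a NameNode restart.
     false makes HDFS skip permission checks entirely, so use this only
     as a temporary diagnostic, never in production. -->
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
```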

I'm confused as to why I get AccessExceptions for hadoop jobs but not dfs
operations when running from the remote client. Hadoop jobs work fine when
run from the master. I'm sure this is some trivial configuration problem.
Any suggestions?
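One place I'd start: pull the fields out of the "Permission denied" line itself and compare them with what `hadoop fs -ls` reports for the directories the job touches. A quick sketch (the regex and names are my own):

```python
import re

# Pull the fields out of an AccessControlException "Permission denied"
# line. The inode is reported as "path":owner:group:mode; note that the
# path is empty ("") in the trace above, which is itself worth chasing.
PATTERN = re.compile(
    r'user=(?P<user>\S+), access=(?P<access>\S+), '
    r'inode="(?P<path>[^"]*)":(?P<owner>[^:]+):(?P<group>[^:]+):(?P<mode>\S+)'
)

def parse_denied(line):
    m = PATTERN.search(line)
    return m.groupdict() if m else None

line = ('Permission denied: user=bhardy, access=WRITE, '
        'inode="":hadoop:supergroup:rwxr-xr-x')
fields = parse_denied(line)
print(fields["user"], fields["access"], fields["owner"], fields["mode"])
```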

thanks
b


On Thu, Nov 19, 2009 at 3:59 PM, Y G <gymitat@gmail.com> wrote:

> you can run your MR program under the *nix account that matches the
> user and group of your HDFS directory,
> or you can turn off the HDFS permission check in the configuration.
>
> 2009/11/20, Ananth T. Sarathy <ananth.t.sarathy@gmail.com>:
> > I just set up a hadoop cluster. When I try to write to it from my Java
> > code, I get the error below. When using core-site.xml, do I need to
> > specify a user?
> >
> >
> >
> > org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=DrWho, access=WRITE, inode="":root:supergroup:rwxr-xr-x
> >     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> >     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> >     at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> >     at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
> >     at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
> >     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:2647)
> >     at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:463)
> >     at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:195)
> >     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:479)
> >     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:460)
> >     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:367)
> >     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:359)
> >     at com.iswcorp.hadoop.HadoopDataManagerImpl.writeToFile(HadoopDataManagerImpl.java:151)
> >     at com.iswcorp.hadoop.HadoopDataManagerImpl.main(HadoopDataManagerImpl.java:46)
> > Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=DrWho, access=WRITE, inode="":root:supergroup:rwxr-xr-x
> >     at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:176)
> >     at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:157)
> >     at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.checkPermission(PermissionChecker.java:105)
> >     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4545)
> >     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4515)
> >     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1023)
> >     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:977)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:389)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >     at java.lang.reflect.Method.invoke(Method.java:597)
> >     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
> >     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
> >     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:396)
> >     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
> >
> >     at org.apache.hadoop.ipc.Client.call(Client.java:739)
> >     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> >     at $Proxy0.create(Unknown Source)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >     at java.lang.reflect.Method.invoke(Method.java:597)
> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
> >     at $Proxy0.create(Unknown Source)
> >     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:2644)
> >     ... 8 more
> > Ananth T Sarathy
> >
>
> --
> Sent from my mobile device
>
> -----
> Wishing you happiness every day
> and good health
>
