Subject: Permission related errors when running with a different user
From: Krishna Kishore Bonagiri <write2kishore@gmail.com>
To: user@hadoop.apache.org
Date: Thu, 6 Dec 2012 20:28:35 +0530

Hi,

  I am running a job as a different user than the one Hadoop was installed with, and I am getting the error below. This is actually a YARN job that I am trying to run. Please help me resolve it.
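For context, the failure happens while my client copies the App Master jar into HDFS. My prepareJarResource() does roughly the following (a simplified sketch; the wrapper class and paths here are illustrative, not my exact code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Simplified sketch of Client.prepareJarResource(); class name and
// paths are placeholders, not the exact code.
public class PrepareJarSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path src = new Path("AppMaster.jar");                         // jar on the local filesystem
        Path dst = new Path(fs.getHomeDirectory(), "AppMaster.jar");  // e.g. /user/root when run as root

        // This is the copy that fails: creating the destination file
        // needs WRITE access on an ancestor directory in HDFS.
        fs.copyFromLocalFile(false, true, src, dst);
    }
}

The full log and stack trace from the client: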
2012-12-06 09:29:13,997 INFO  Client (Client.java:prepareJarResource(293)) - Copy App Master jar from local filesystem and add to local environment
2012-12-06 09:29:14,476 FATAL Client (Client.java:main(148)) - Error running CLient
org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/":kbonagir:supergroup:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4203)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4174)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1574)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1509)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:410)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
        at java.security.AccessController.doPrivileged(AccessController.java:284)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1686)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:56)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:39)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:527)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:90)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
        at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1250)
        at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1266)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1090)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1048)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:232)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:75)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:804)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:785)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:684)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:259)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:232)
        at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1817)
        at Client.prepareJarResource(Client.java:299)
        at Client.launchAndMonitorAM(Client.java:509)
        at Client.main(Client.java:146)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
        at java.lang.reflect.Method.invoke(Method.java:611)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/":kbonagir:supergroup:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4203)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4174)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1574)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1509)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:410)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
        at java.security.AccessController.doPrivileged(AccessController.java:284)
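From the inode line it looks like "/" in HDFS is owned by kbonagir (the install user) with mode drwxr-xr-x, so root has no WRITE access anywhere under it. Would the right fix be to create a home directory for root and hand it over, roughly like the sketch below? This is just my guess at a workaround (the path and group are assumptions), and it would have to be run as the user that owns HDFS:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Rough sketch of a possible workaround: create /user/root and make
// root its owner so the client has a writable place in HDFS.
public class MakeRootHome {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        Path rootHome = new Path("/user/root");
        fs.mkdirs(rootHome);                    // needs WRITE on "/", so run as kbonagir
        fs.setOwner(rootHome, "root", "root");  // owner, group (group is my assumption)
    }
}

Thanks,
Kishore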