Date: Tue, 22 Jul 2014 14:14:50 +0800
Subject: issue about running an MR job as a system user in CDH5
From: ch huang <justlooks@gmail.com>
To: user@hadoop.apache.org

hi, maillist:

I set up a CDH5 YARN cluster and set the following option in my mapred-site.xml file:

<property>
  <name>yarn.app.mapreduce.am.staging-dir</name>
  <value>/data</value>
</property>

The MapReduce history server will put its history directory under /data, but when I submit an MR job as another user I get the error below. I also added that user to the hadoop group, but it did not help. Why does this happen, and how can I fix it? Thanks.

2014-07-22 14:07:06,734 INFO  [main] mapreduce.TableOutputFormat: Created table instance for test_1
2014-07-22 14:07:06,765 WARN  [main] security.UserGroupInformation: PriviledgedActionException as:hbase (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=EXECUTE, inode="/data":mapred:hadoop:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:265)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:251)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:168)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5490)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3499)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:764)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:764)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1986)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1982)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1980)

Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=EXECUTE, inode="/data":mapred:hadoop:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:265)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:251)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:168)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5490)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3499)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:764)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:764)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1986)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1982)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1980)
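[Editor's note: the check that fails here (FSPermissionChecker.checkTraverse/checkFsPermission in the trace) follows ordinary POSIX semantics: /data is owned by mapred:hadoop with mode drwxrwx---, so user hbase needs to be the owner, in group hadoop, or covered by the (empty) "other" bits to get EXECUTE on the directory. A minimal sketch of that logic, not Hadoop's actual code and with illustrative names only:]

```python
def has_access(user, user_groups, owner, group, mode, want):
    """Return True if `user` gets permission bit `want` ('r', 'w', or 'x')
    on an inode owned by owner:group with a 9-char mode like 'rwxrwx---'.

    Mirrors the POSIX class selection: owner bits if the user owns the
    inode, group bits if one of the user's groups matches, else other bits.
    """
    if user == owner:
        bits = mode[0:3]   # owner class
    elif group in user_groups:
        bits = mode[3:6]   # group class
    else:
        bits = mode[6:9]   # other class
    return want in bits

# hbase is not in group hadoop: falls to the '---' other class -> denied
print(has_access("hbase", ["hbase"], "mapred", "hadoop", "rwxrwx---", "x"))           # False
# once the NameNode sees hbase in group hadoop: 'rwx' group class -> allowed
print(has_access("hbase", ["hbase", "hadoop"], "mapred", "hadoop", "rwxrwx---", "x"))  # True
```

Since group bits on /data are rwx, membership in group hadoop should be enough; if adding the user to the group did not help, one likely reason is that HDFS resolves group membership on the NameNode host and caches it, so the NameNode may not yet see the new mapping.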