Return-Path: 
X-Original-To: apmail-hbase-dev-archive@www.apache.org
Delivered-To: apmail-hbase-dev-archive@www.apache.org
Received: from mail.apache.org (hermes.apache.org [140.211.11.3]) by minotaur.apache.org (Postfix) with SMTP id 1B8AF104A1 for ; Wed, 11 Sep 2013 20:42:58 +0000 (UTC)
Received: (qmail 15946 invoked by uid 500); 11 Sep 2013 20:42:54 -0000
Delivered-To: apmail-hbase-dev-archive@hbase.apache.org
Received: (qmail 15806 invoked by uid 500); 11 Sep 2013 20:42:53 -0000
Mailing-List: contact dev-help@hbase.apache.org; run by ezmlm
Precedence: bulk
List-Help: 
List-Unsubscribe: 
List-Post: 
List-Id: 
Reply-To: dev@hbase.apache.org
Delivered-To: mailing list dev@hbase.apache.org
Received: (qmail 15450 invoked by uid 99); 11 Sep 2013 20:42:53 -0000
Received: from arcas.apache.org (HELO arcas.apache.org) (140.211.11.28) by apache.org (qpsmtpd/0.29) with ESMTP; Wed, 11 Sep 2013 20:42:53 +0000
Date: Wed, 11 Sep 2013 20:42:52 +0000 (UTC)
From: "Himanshu Vashishtha (JIRA)" 
To: dev@hbase.apache.org
Message-ID: 
In-Reply-To: 
References: 
Subject: [jira] [Created] (HBASE-9509) Fix HFile V1 Detector to handle AccessControlException for non-existant files
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 7bit
X-JIRA-FingerPrint: 30527f35849b9dde25b450d4833f0394

Himanshu Vashishtha created HBASE-9509:
------------------------------------------

             Summary: Fix HFile V1 Detector to handle AccessControlException for non-existant files
                 Key: HBASE-9509
                 URL: https://issues.apache.org/jira/browse/HBASE-9509
             Project: HBase
          Issue Type: Bug
    Affects Versions: 0.96.0
            Reporter: Himanshu Vashishtha
             Fix For: 0.98.0, 0.96.0


On some Hadoop versions, fs.exists() throws an AccessControlException if there is a non-searchable inode in the file path; versions such as 2.1.0-beta simply return false. This jira is to fix the HFile V1 detector tool so that it avoids making such calls.
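As a rough illustration of the direction (a minimal sketch only, not the actual patch; SafeExists is a hypothetical helper name), the detector could route its existence checks through a wrapper that treats an AccessControlException the same way 2.1.0-beta does, i.e. as a plain false, so one unreadable path does not abort the whole scan:

{code}
import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.AccessControlException;

/**
 * Hypothetical helper, not the HBASE-9509 patch itself: mirror the
 * 2.1.0-beta behaviour of fs.exists() when a permission check fails
 * part-way through the path.
 */
public final class SafeExists {

  private SafeExists() {
  }

  /**
   * Returns fs.exists(path), but converts an AccessControlException
   * (thrown by some Hadoop versions when an intermediate inode in the
   * path lacks EXECUTE/search permission) into false so the caller can
   * keep scanning the remaining paths.
   */
  public static boolean exists(FileSystem fs, Path path) throws IOException {
    try {
      return fs.exists(path);
    } catch (AccessControlException ace) {
      return false;
    }
  }
}
{code}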
See the exception below, seen when running the tool on one Hadoop version:

{code}
ERROR util.HFileV1Detector: org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=EXECUTE, inode="/hbase/.META./.tableinfo.0000000001":hbase:supergroup:-rw-r--r--
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:234)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:187)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:150)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5141)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5123)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkTraverse(FSNamesystem.java:5102)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3265)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:719)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:692)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59628)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2036)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2034)
{code}

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira