Date: Mon, 7 Nov 2016 22:03:58 +0000 (UTC)
From: "Binzi Cao (JIRA)"
To: issues@hbase.apache.org
Subject: [jira] [Commented] (HBASE-17040) HBase Spark does not work in Kerberos and yarn-master mode

[ https://issues.apache.org/jira/browse/HBASE-17040?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15645605#comment-15645605 ]

Binzi Cao commented on HBASE-17040:
-----------------------------------

My `spark-submit` command is as follows:

```
HADOOP_USER_NAME=spark spark-submit \
  --jars "local:///opt/cloudera/parcels/CDH/jars/hbase-spark-1.2.0-cdh5.8.2.jar" \
  --driver-memory 5G \
  --executor-memory 5G \
  --num-executors 5 \
  --deploy-mode cluster \
  --files "my_application.conf,hbase.keytab" \
  --class "MySparkApp" --master yarn \
  --driver-java-options "-Dconfig.file=my_application.conf" \
  MyHBaseApp.jar
```

I tried two different ways to start the job:

1. Run `kinit` first as a user with Spark and HBase permissions. The Spark job starts successfully but fails to create the `HBaseContext`, with the exceptions above. The same submit command works if I change `--deploy-mode` to `client`.
2. Pass the keytab file to the job and load it in code. The principal and keytab file are loaded properly, but the `HBaseContext` still cannot be created, because it always uses the current user of the job instead of the keytab credentials.

> HBase Spark does not work in Kerberos and yarn-master mode
> ----------------------------------------------------------
>
> Key: HBASE-17040
> URL: https://issues.apache.org/jira/browse/HBASE-17040
> Project: HBase
> Issue Type: Bug
> Components: spark
> Affects Versions: 2.0.0
> Environment: HBase
> Kerberos
> Yarn
> Cloudera
> Reporter: Binzi Cao
>
> We are loading HBase records into an RDD with the hbase-spark library in Cloudera.
> The hbase-spark code works if we submit the job in client mode, but does not work in cluster mode. We get the exceptions below:
> ```
> 16/11/07 05:43:28 WARN security.UserGroupInformation: PriviledgedActionException as:spark (auth:SIMPLE) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 16/11/07 05:43:28 WARN ipc.RpcClientImpl: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 16/11/07 05:43:28 ERROR ipc.RpcClientImpl: SASL authentication failed.
> The most likely cause is missing or invalid credentials. Consider 'kinit'.
> javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>   at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
>   at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:181)
>   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
>   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
>   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
>   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:422)
>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
>   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
>   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
>   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
>   at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
>   at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
>   at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
>   at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:34118)
>   at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1627)
>   at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92)
>   at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89)
>   at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>   at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95)
>   at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callBlockingMethod(CoprocessorRpcChannel.java:73)
>   at org.apache.hadoop.hbase.protobuf.generated.AuthenticationProtos$AuthenticationService$BlockingStub.getAuthenticationToken(AuthenticationProtos.java:4512)
>   at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:86)
>   at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:111)
>   at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:108)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:422)
>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
>   at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:340)
>   at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:108)
>   at org.apache.hadoop.hbase.security.token.TokenUtil.addTokenForJob(TokenUtil.java:329)
>   at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.initCredentials(TableMapReduceUtil.java:490)
>   at org.apache.hadoop.hbase.spark.HBaseContext.<init>(HBaseContext.scala:70)
> ```

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
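A possible workaround sketch, not taken from the report: on YARN, `spark-submit` can perform the Kerberos login itself via its standard `--principal` and `--keytab` options, so the driver in cluster mode starts with a fresh TGT rather than relying on a `kinit` ticket cache that never reaches the YARN container. Whether the hbase-spark token acquisition then succeeds still depends on the HBase jars and configuration being visible to the driver; the principal name, keytab path, and jar/class names below are placeholders.

```shell
# Sketch only: let Spark log in from the keytab itself (YARN cluster mode).
# --principal/--keytab are standard spark-submit options; all names and
# paths here are placeholders, not values from the original report.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal spark/host.example.com@EXAMPLE.COM \
  --keytab /path/to/hbase.keytab \
  --jars "local:///opt/cloudera/parcels/CDH/jars/hbase-spark-1.2.0-cdh5.8.2.jar" \
  --class "MySparkApp" \
  MyHBaseApp.jar
```

With this form, Spark distributes the keytab to the application master and re-logs in periodically, which avoids the "No valid credentials provided" failure mode caused by the container user having no TGT of its own.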