From: Vivek Shrivastava
Date: Wed, 1 Feb 2017 10:29:40 -0500
Subject: Re: Pls Help me - Hive Kerberos Issue
To: user@hive.apache.org

The attached file does not look like hive-site.xml. What is the value of
hive.server2.authentication in hive-site.xml? Also, your sasl.qop value
should be one of the three allowed values, not all three.

On Mon, Jan 30, 2017 at 4:28 PM, Ricardo Fajardo <ricardo.fajardo@autodesk.com> wrote:
> Attached the hive-site.xml configuration file.
> ------------------------------
> *From:* Vivek Shrivastava
> *Sent:* Monday, January 30, 2017 4:10:42 PM
> *To:* user@hive.apache.org
> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>
> If this is working, then your Kerberos setup is OK. I suspect a
> configuration issue in HiveServer2. What is the authentication and
> security setup in the Hive config? Please attach it if you can.
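A Kerberos-enabled HiveServer2 typically carries entries along these lines in hive-site.xml (a sketch using standard Hive property names; the realm, keytab path, and qop value here are placeholders, not values taken from this thread):

```xml
<!-- Sketch: Kerberos-related HiveServer2 settings; all values are placeholders. -->
<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>hive/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/etc/hive/conf/hive.keytab</value>
</property>
<property>
  <!-- must be exactly one of: auth, auth-int, auth-conf -->
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth</value>
</property>
```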
> > On Mon, Jan 30, 2017 at 2:33 PM, Ricardo Fajardo < > ricardo.fajardo@autodesk.com> wrote: > >> [cloudera@quickstart bin]$ >> [cloudera@quickstart bin]$ hadoop fs -ls >> Java config name: null >> Native config name: /etc/krb5.conf >> Loaded from native config >> Found 20 items >> drwxr-xr-x - cloudera cloudera 0 2016-06-13 17:51 checkpoint >> -rw-r--r-- 1 cloudera cloudera 3249 2016-05-11 16:19 hadoop.txt >> drwxr-xr-x - cloudera cloudera 0 2016-06-02 16:15 hadoop2.txt >> drwxr-xr-x - cloudera cloudera 0 2016-06-02 16:30 hadoop3.txt >> drwxr-xr-x - cloudera cloudera 0 2016-06-16 16:37 gives >> drwxr-xr-x - cloudera cloudera 0 2016-06-16 16:06 out1 >> -rw-r--r-- 1 cloudera cloudera 3868 2016-06-15 08:39 >> post.small0.xml >> drwxr-xr-x - cloudera cloudera 0 2016-07-14 17:01 tCount1 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 15:57 test1 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 16:57 test10 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 17:33 test12 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 16:02 test2 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 16:24 test3 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 16:27 test4 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 16:32 test5 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 16:37 test6 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 16:49 test7 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 16:51 test8 >> drwxr-xr-x - cloudera cloudera 0 2016-06-21 16:54 test9 >> -rw-r--r-- 1 cloudera cloudera 8481022 2016-06-08 21:51 train.tsv >> [cloudera@quickstart bin]$ >> [cloudera@quickstart bin]$ >> [cloudera@quickstart bin]$ >> [cloudera@quickstart bin]$ echo $HADOOP_OPTS >> -Dsun.security.krb5.debug=true >> [cloudera@quickstart bin]$ >> >> ------------------------------ >> *From:* Vivek Shrivastava >> *Sent:* Monday, January 30, 2017 2:28:53 PM >> >> *To:* user@hive.apache.org >> *Subject:* Re: Pls Help me - Hive Kerberos Issue >> >> If you are using AES256, then please do update java unlimited strength >> jar 
files. What is the output of hadoop ls command after exporting the >> below environment variable? >> >> export HADOOP_OPTS="-Dsun.security.krb5.debug=true" >> hadoop fs -ls / >> >> On Mon, Jan 30, 2017 at 2:21 PM, Ricardo Fajardo < >> ricardo.fajardo@autodesk.com> wrote: >> >>> I did the changes but I am getting the same error. >>> >>> Klist: >>> >>> [cloudera@quickstart bin]$ klist -fe >>> Ticket cache: FILE:/tmp/krb5cc_501 >>> Default principal: t_fajar@ADS.AUTODESK.COM >>> >>> Valid starting Expires Service principal >>> 01/30/17 11:56:20 01/30/17 21:56:24 krbtgt/ADS.AUTODESK.COM@ADS.A >>> UTODESK.COM >>> renew until 01/31/17 11:56:20, Flags: FPRIA >>> Etype (skey, tkt): aes256-cts-hmac-sha1-96, arcfour-hmac >>> >>> >>> Log: >>> >>> [cloudera@quickstart bin]$ export HADOOP_OPTS="-Dsun.security.kr >>> b5.debug=true" >>> [cloudera@quickstart bin]$ >>> [cloudera@quickstart bin]$ >>> [cloudera@quickstart bin]$ ./beeline -u "jdbc:hive2://localhost:10000/ >>> default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.p >>> roxy.user=t_fajar" >>> /home/cloudera/workspace/hive/bin/hive: line 99: [: >>> /home/cloudera/workspace/hive/lib/hive-exec-2.2.0-SNAPSHOT-core.jar: >>> binary operator expected >>> SLF4J: Class path contains multiple SLF4J bindings. 
>>> SLF4J: Found binding in [jar:file:/home/cloudera/works >>> pace/hive/lib/benchmarks.jar!/org/slf4j/impl/StaticLoggerBinder.class] >>> SLF4J: Found binding in [jar:file:/home/cloudera/works >>> pace/hive/lib/hive-jdbc-2.2.0-SNAPSHOT-standalone.jar!/org/s >>> lf4j/impl/StaticLoggerBinder.class] >>> SLF4J: Found binding in [jar:file:/home/cloudera/works >>> pace/hive/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4 >>> j/impl/StaticLoggerBinder.class] >>> SLF4J: Found binding in [jar:file:/home/cloudera/works >>> pace/hive/lib/spark-examples-1.6.0-hadoop2.6.0.jar!/org/slf4 >>> j/impl/StaticLoggerBinder.class] >>> SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/l >>> ib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class] >>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an >>> explanation. >>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] >>> Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_ >>> HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar >>> Java config name: null >>> Native config name: /etc/krb5.conf >>> Loaded from native config >>> 17/01/30 12:08:59 [main]: ERROR transport.TSaslTransport: SASL >>> negotiation failure >>> javax.security.sasl.SaslException: GSS initiate failed >>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) >>> ~[?:1.8.0_73] >>> at org.apache.thrift.transport.TSaslClientTransport.handleSaslS >>> tartMessage(TSaslClientTransport.java:94) ~[benchmarks.jar:2.2.0-SNAPSHO >>> T] >>> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) >>> [benchmarks.jar:2.2.0-SNAPSHOT] >>> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) >>> [benchmarks.jar:2.2.0-SNAPSHOT] >>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1 >>> .run(TUGIAssumingTransport.java:52) [benchmarks.jar:2.2.0-SNAPSHOT] >>> at 
org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1 >>> .run(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT] >>> at java.security.AccessController.doPrivileged(Native Method) >>> ~[?:1.8.0_73] >>> at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_73] >>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) >>> [benchmarks.jar:2.2.0-SNAPSHOT] >>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.o >>> pen(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) >>> [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:182) >>> [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) >>> [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at java.sql.DriverManager.getConnection(DriverManager.java:664) >>> [?:1.8.0_73] >>> at java.sql.DriverManager.getConnection(DriverManager.java:208) >>> [?:1.8.0_73] >>> at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.beeline.Commands.connect(Commands.java:1524) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.beeline.Commands.connect(Commands.java:1419) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) >>> ~[?:1.8.0_73] >>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) >>> ~[?:1.8.0_73] >>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) >>> ~[?:1.8.0_73] >>> at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73] >>> at 
org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:797) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:885) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) >>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT] >>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) >>> ~[?:1.8.0_73] >>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) >>> ~[?:1.8.0_73] >>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) >>> ~[?:1.8.0_73] >>> at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73] >>> at org.apache.hadoop.util.RunJar.run(RunJar.java:221) >>> [benchmarks.jar:2.2.0-SNAPSHOT] >>> at org.apache.hadoop.util.RunJar.main(RunJar.java:136) >>> [benchmarks.jar:2.2.0-SNAPSHOT] >>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided >>> (Mechanism level: Failed to find any Kerberos tgt) >>> at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) >>> ~[?:1.8.0_73] >>> at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) >>> ~[?:1.8.0_73] >>> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) >>> ~[?:1.8.0_73] >>> at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) >>> ~[?:1.8.0_73] 
>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>> ~[?:1.8.0_73]
>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>> ~[?:1.8.0_73]
>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
>>> ~[?:1.8.0_73]
>>> ... 35 more
>>> 17/01/30 12:08:59 [main]: WARN jdbc.HiveConnection: Failed to connect to
>>> localhost:10000
>>> HS2 may be unavailable, check server status
>>> Error: Could not open client transport with JDBC Uri:
>>> jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar:
>>> GSS initiate failed (state=08S01,code=0)
>>> Beeline version 2.2.0-SNAPSHOT by Apache Hive
>>> beeline>
>>>
>>> ------------------------------
>>> *From:* Vivek Shrivastava
>>> *Sent:* Monday, January 30, 2017 11:34:27 AM
>>> *To:* user@hive.apache.org
>>> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>>>
>>> You can comment both default_tkt_enctypes and default_tgs_enctypes out;
>>> the default value will become aes256-cts-hmac-sha1-96
>>> aes128-cts-hmac-sha1-96 des3-cbc-sha1 arcfour-hmac-md5
>>> camellia256-cts-cmac camellia128-cts-cmac des-cbc-crc des-cbc-md5
>>> des-cbc-md4.
>>> Then do:
>>> kdestroy
>>> kinit
>>> klist -fev
>>> your beeline command
>>>
>>> If it still does not work, then paste the output of:
>>>
>>> export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
>>> hadoop fs -ls /
>>>
>>> On Mon, Jan 30, 2017 at 11:11 AM, Ricardo Fajardo <ricardo.fajardo@autodesk.com> wrote:
>>>
>>>> I don't have any particular reason for selecting the arcfour encryption
>>>> type. If I need to change it and it will work, I can do that.
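Commenting those two lines out would leave a [libdefaults] section roughly like the sketch below (realm taken from the thread; with the enctype lines gone, libkrb5 falls back to its built-in, AES-first enctype list):

```ini
# Sketch of a cleaned-up krb5.conf [libdefaults]; other settings left as-is.
[libdefaults]
    default_realm = ADS.AUTODESK.COM
    forwardable = true
    proxiable = true
    # default_tkt_enctypes / default_tgs_enctypes intentionally omitted:
    # the library then uses its built-in enctype list, with AES types first.
```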
>>>>
>>>> Values from krb5.conf:
>>>>
>>>> [libdefaults]
>>>>     default_realm = ADS.AUTODESK.COM
>>>>     krb4_config = /etc/krb.conf
>>>>     krb4_realms = /etc/krb.realms
>>>>     kdc_timesync = 1
>>>>     ccache_type = 4
>>>>     forwardable = true
>>>>     proxiable = true
>>>>     v4_instance_resolve = false
>>>>     v4_name_convert = {
>>>>         host = {
>>>>             rcmd = host
>>>>             ftp = ftp
>>>>         }
>>>>         plain = {
>>>>             something = something-else
>>>>         }
>>>>     }
>>>>     fcc-mit-ticketflags = true
>>>>     default_tkt_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
>>>>     default_tgs_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
>>>>
>>>> [realms]
>>>>     ADS.AUTODESK.COM = {
>>>>         kdc = krb.ads.autodesk.com:88
>>>>         admin_server = krb.ads.autodesk.com
>>>>         default_domain = ads.autodesk.com
>>>>         database_module = openldap_ldapconf
>>>>         master_key_type = aes256-cts
>>>>         supported_enctypes = aes256-cts:normal aes128-cts:normal
>>>>             des3-hmac-sha1:normal arcfour-hmac:normal
>>>>             des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
>>>>         default_principal_flags = +preauth
>>>>     }
>>>>
>>>> Thanks so much for your help,
>>>> Richard.
>>>> ------------------------------
>>>> *From:* Vivek Shrivastava
>>>> *Sent:* Monday, January 30, 2017 11:01:24 AM
>>>> *To:* user@hive.apache.org
>>>> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>>>>
>>>> Any particular reason for selecting the arcfour encryption type? Could
>>>> you please post the defaults (e.g. enctype) values from krb5.conf?
>>>>
>>>> On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo <ricardo.fajardo@autodesk.com> wrote:
>>>>
>>>>> 1.
klist -fe >>>>> >>>>> [cloudera@quickstart bin]$ klist -fe >>>>> Ticket cache: FILE:/tmp/krb5cc_501 >>>>> Default principal: t_fajar@ADS.AUTODESK.COM >>>>> >>>>> Valid starting Expires Service principal >>>>> 01/30/17 10:52:37 01/30/17 20:52:43 krbtgt/ADS.AUTODESK.COM@ADS.A >>>>> UTODESK.COM >>>>> renew until 01/31/17 10:52:37, Flags: FPRIA >>>>> Etype (skey, tkt): arcfour-hmac, arcfour-hmac >>>>> [cloudera@quickstart bin]$ >>>>> >>>>> 2. relevant entries from HiveServer2 log >>>>> >>>>> >>>>> beeline> !connect jdbc:hive2://localhost:10000/d >>>>> efault;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.pr >>>>> oxy.user=t_fajar >>>>> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@AD >>>>> S. >>>>> AUTODESK.COM;hive.server2.proxy.user=t_fajar >>>>> SLF4J: Class path contains multiple SLF4J bindings. >>>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r >>>>> epository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/lo >>>>> g4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class] >>>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r >>>>> epository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1. >>>>> jar!/org/slf4j/impl/StaticLoggerBinder.class] >>>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r >>>>> epository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.1 >>>>> 0.jar!/org/slf4j/impl/StaticLoggerBinder.class] >>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an >>>>> explanation. 
>>>>> SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4 >>>>> jLoggerFactory] >>>>> Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_ >>>>> HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar >>>>> 17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000 >>>>> 17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000 >>>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field >>>>> org.apache.hadoop.metrics2.lib.MutableRate >>>>> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess >>>>> with annotation @org.apache.hadoop.metrics2.an >>>>> notation.Metric(valueName=Time, value=[Rate of successful kerberos >>>>> logins and latency (milliseconds)], about=, type=DEFAULT, always=false, >>>>> sampleName=Ops) >>>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field >>>>> org.apache.hadoop.metrics2.lib.MutableRate >>>>> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure >>>>> with annotation @org.apache.hadoop.metrics2.an >>>>> notation.Metric(valueName=Time, value=[Rate of failed kerberos logins >>>>> and latency (milliseconds)], about=, type=DEFAULT, always=false, >>>>> sampleName=Ops) >>>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field >>>>> org.apache.hadoop.metrics2.lib.MutableRate >>>>> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups >>>>> with annotation @org.apache.hadoop.metrics2.an >>>>> notation.Metric(valueName=Time, value=[GetGroups], about=, >>>>> type=DEFAULT, always=false, sampleName=Ops) >>>>> 17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group >>>>> related metrics >>>>> 17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0 >>>>> 17/01/27 16:16:37 DEBUG Groups: Creating new Groups object >>>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the >>>>> custom-built native-hadoop library... 
>>>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop >>>>> with error: java.lang.UnsatisfiedLinkError: no hadoop in >>>>> java.library.path >>>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: >>>>> java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/l >>>>> ib64:/lib:/usr/lib >>>>> 17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop >>>>> library for your platform... using builtin-java classes where applicable >>>>> 17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell >>>>> based >>>>> 17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group >>>>> mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping >>>>> 17/01/27 16:16:38 DEBUG Groups: Group mapping >>>>> impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; >>>>> cacheTimeout=300000; warningDeltaMs=5000 >>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login >>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit >>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: using local >>>>> user:UnixPrincipal: cloudera >>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: Using user: >>>>> "UnixPrincipal: cloudera" with name cloudera >>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera" >>>>> 17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera >>>>> (auth:SIMPLE) >>>>> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod = >>>>> SIMPLE >>>>> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as >>>>> passed-in authMethod of kerberos != current. 
>>>>> 17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction >>>>> as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.th >>>>> rift.HadoopThriftAuthBridge$Client.createClientTransport(Had >>>>> oopThriftAuthBridge.java:208) >>>>> 17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction >>>>> as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.th >>>>> rift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) >>>>> 17/01/30 10:55:02 DEBUG TSaslTransport: opening transport >>>>> org.apache.thrift.transport.TSaslClientTransport@1119f7c5 >>>>> 17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure >>>>> javax.security.sasl.SaslException: GSS initiate failed >>>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212) >>>>> ~[?:1.7.0_67] >>>>> at org.apache.thrift.transport.TSaslClientTransport.handleSaslS >>>>> tartMessage(TSaslClientTransport.java:94) ~[libthrift-0.9.3.jar:0.9.3] >>>>> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) >>>>> [libthrift-0.9.3.jar:0.9.3] >>>>> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) >>>>> [libthrift-0.9.3.jar:0.9.3] >>>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1 >>>>> .run(TUGIAssumingTransport.java:52) [classes/:?] >>>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1 >>>>> .run(TUGIAssumingTransport.java:1) [classes/:?] >>>>> at java.security.AccessController.doPrivileged(Native Method) >>>>> ~[?:1.7.0_67] >>>>> at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67] >>>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) >>>>> [hadoop-common-2.7.2.jar:?] >>>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.o >>>>> pen(TUGIAssumingTransport.java:49) [classes/:?] >>>>> at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) >>>>> [classes/:?] 
>>>>> at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:182) >>>>> [classes/:?] >>>>> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) >>>>> [classes/:?] >>>>> at java.sql.DriverManager.getConnection(DriverManager.java:571) >>>>> [?:1.7.0_67] >>>>> at java.sql.DriverManager.getConnection(DriverManager.java:187) >>>>> [?:1.7.0_67] >>>>> at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) >>>>> [classes/:?] >>>>> at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) >>>>> [classes/:?] >>>>> at org.apache.hive.beeline.Commands.connect(Commands.java:1524) >>>>> [classes/:?] >>>>> at org.apache.hive.beeline.Commands.connect(Commands.java:1419) >>>>> [classes/:?] >>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) >>>>> ~[?:1.7.0_67] >>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) >>>>> ~[?:1.7.0_67] >>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) >>>>> ~[?:1.7.0_67] >>>>> at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67] >>>>> at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) >>>>> [classes/:?] >>>>> at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) >>>>> [classes/:?] >>>>> at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) >>>>> [classes/:?] >>>>> at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999) >>>>> [classes/:?] >>>>> at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) >>>>> [classes/:?] >>>>> at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) >>>>> [classes/:?] >>>>> at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?] 
>>>>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided >>>>> (Mechanism level: Failed to find any Kerberos tgt) >>>>> at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) >>>>> ~[?:1.7.0_67] >>>>> at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121) >>>>> ~[?:1.7.0_67] >>>>> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) >>>>> ~[?:1.7.0_67] >>>>> at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223) >>>>> ~[?:1.7.0_67] >>>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) >>>>> ~[?:1.7.0_67] >>>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) >>>>> ~[?:1.7.0_67] >>>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193) >>>>> ~[?:1.7.0_67] >>>>> ... 29 more >>>>> 17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with >>>>> status BAD and payload length 19 >>>>> 17/01/30 10:55:02 WARN HiveConnection: Failed to connect to >>>>> localhost:10000 >>>>> HS2 may be unavailable, check server status >>>>> Error: Could not open client transport with JDBC Uri: >>>>> jdbc:hive2://localhost:10000/default;principal=hive/_HOST@AD >>>>> S.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed >>>>> (state=08S01,code=0) >>>>> beeline> >>>>> >>>>> ------------------------------ >>>>> *From:* Vivek Shrivastava >>>>> *Sent:* Monday, January 30, 2017 10:48:35 AM >>>>> *To:* user@hive.apache.org >>>>> *Subject:* Re: Pls Help me - Hive Kerberos Issue >>>>> >>>>> Please paste the output of >>>>> 1. klist -fe >>>>> 2. relevant entries from HiveServer2 log >>>>> >>>>> On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo < >>>>> ricardo.fajardo@autodesk.com> wrote: >>>>> >>>>>> I could not resolve the problem. >>>>>> >>>>>> >>>>>> I have debugged the code and I found out that: >>>>>> >>>>>> >>>>>> 1. 
On the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class,
>>>>>> line 208:
>>>>>>
>>>>>> .... UserGroupInformation.getCurrentUser() ....
>>>>>>
>>>>>> This method always returns the operating-system user, but I need to
>>>>>> authenticate the user set in the property
>>>>>> hive.server2.proxy.user=yourid, because I have a token for that user.
>>>>>>
>>>>>> 2. I have found that hive.server2.proxy.user is implemented in the
>>>>>> org.apache.hive.jdbc.HiveConnection class, method openSession(), but
>>>>>> this code is never executed.
>>>>>>
>>>>>> 3. In the org.apache.hive.service.auth.HiveAuthFactory class there is
>>>>>> this code in the method getAuthTransFactory():
>>>>>>
>>>>>> ....
>>>>>> if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName())) {
>>>>>>   // no-op
>>>>>> ....
>>>>>>
>>>>>> Does this mean that Kerberos authentication is not implemented?
>>>>>>
>>>>>> Can anyone please help me?
>>>>>>
>>>>>> Thanks,
>>>>>> Richard.
>>>>>> ------------------------------
>>>>>> *From:* Dulam, Naresh
>>>>>> *Sent:* Thursday, January 26, 2017 8:41:48 AM
>>>>>> *To:* user@hive.apache.org
>>>>>> *Subject:* RE: Pls Help me - Hive Kerberos Issue
>>>>>>
>>>>>> kinit -k -t your.keytab yourid@MY-REALM.COM
>>>>>>
>>>>>> # Connect using the following JDBC connection string
>>>>>> # jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_HOST@MY-REALM.COM;hive.server2.proxy.user=yourid
>>>>>>
>>>>>> *From:* Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
>>>>>> *Sent:* Thursday, January 26, 2017 1:37 AM
>>>>>> *To:* user@hive.apache.org
>>>>>> *Subject:* Pls Help me - Hive Kerberos Issue
>>>>>>
>>>>>> Hello,
>>>>>>
>>>>>> Please, I need your help with Kerberos authentication with Hive.
>>>>>>
>>>>>> I am following this guide:
>>>>>> https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1
>>>>>>
>>>>>> But I am getting this error:
>>>>>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>>>>>> (Mechanism level: Failed to find any Kerberos tgt)
>>>>>>
>>>>>> I have a remote Kerberos server and I can generate a ticket with
>>>>>> kinit for my user. I created a keytab file with my password for my
>>>>>> user. Please tell me if that is OK.
>>>>>>
>>>>>> On the other hand, when I am debugging the Hive code, the
>>>>>> operating-system user is authenticated, but I need to authenticate my
>>>>>> Kerberos user. Can you tell me how to achieve that? How can I store
>>>>>> my tickets where Hive can load them? Or how can I verify where Hive
>>>>>> is searching for the tickets and what Hive is reading?
>>>>>>
>>>>>> Thanks so much for your help.
>>>>>>
>>>>>> Best regards,
>>>>>> Richard.
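Broken out, the connection string suggested above has the following shape (annotated; the host, realm, and user are the same placeholders used in that message):

```
jdbc:hive2://myHost.myOrg.com:10000/default   <- HiveServer2 host, Thrift port, database
  ;principal=hive/_HOST@MY-REALM.COM          <- service principal of HiveServer2; the
                                                 client replaces _HOST with the server FQDN
  ;hive.server2.proxy.user=yourid             <- optional: run the session as this user
                                                 (needs Hadoop proxy-user privileges)
```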
The attached file does not look like hive-site.xml. What is the value of hive.server2.authentication in hive-site.xml? Also, your sasl.qop value should be one of the three values, not all three.
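For reference, the settings being asked about live in hive-site.xml. A minimal illustrative fragment follows — the property names are the stock HiveServer2 ones, but the principal and keytab path are placeholders, not values from this thread:

```xml
<!-- Illustrative hive-site.xml fragment: Kerberos auth for HiveServer2.
     Principal and keytab values are placeholders; substitute your own. -->
<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>hive/_HOST@MY-REALM.COM</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/etc/hive/conf/hive.keytab</value>
</property>
<property>
  <!-- pick exactly one of: auth, auth-int, auth-conf -->
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth</value>
</property>
```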

On Mon, Jan 30, 2017 at 4:28 PM, Ricardo Fajardo <ricardo.fajardo@autodesk.com> wrote:

Attached is the hive-site.xml configuration file.


From: Vivek Shrivastava <vivshrivastava@gmail.com>
Sent: Monday, January 30, 2017 4:10:42 PM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

If this is working then your Kerberos setup is OK. I suspect the HiveServer2 configuration. What are the authentication and security settings in your Hive config? Please see if you can attach it.

On Mon, Jan 30, 2017 at 2:33 PM, Ricardo Fajardo <ricardo.fajardo@autodesk.com> wrote:

[cloudera@quickstart bin]$ hadoop fs -ls
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
Found 20 items
drwxr-xr-x   - cloudera cloudera          0 2016-06-13 17:51 checkpoint
-rw-r--r--   1 cloudera cloudera       3249 2016-05-11 16:19 hadoop.txt
drwxr-xr-x   - cloudera cloudera          0 2016-06-02 16:15 hadoop2.txt
drwxr-xr-x   - cloudera cloudera          0 2016-06-02 16:30 hadoop3.txt
drwxr-xr-x   - cloudera cloudera          0 2016-06-16 16:37 gives
drwxr-xr-x   - cloudera cloudera          0 2016-06-16 16:06 out1
-rw-r--r--   1 cloudera cloudera       3868 2016-06-15 08:39 post.small0.xml
drwxr-xr-x   - cloudera cloudera          0 2016-07-14 17:01 tCount1
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 15:57 test1
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:57 test10
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 17:33 test12
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:02 test2
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:24 test3
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:27 test4
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:32 test5
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:37 test6
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:49 test7
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:51 test8
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:54 test9
-rw-r--r--   1 cloudera cloudera    8481022 2016-06-08 21:51 train.tsv
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$ echo $HADOOP_OPTS
-Dsun.security.krb5.debug=true
[cloudera@quickstart bin]$


From: Vivek Shrivastava <vivshrivastava@gmail.com>
Sent: Monday, January 30, 2017 2:28:53 PM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

If you are using AES256, then please do update the Java unlimited-strength policy jar files. What is the output of the hadoop ls command after exporting the environment variable below?

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /

On Mon, Jan 30, 2017 at 2:21 PM, Ricardo Fajardo <ricardo.fajardo@autodesk.com> wrote:

I made the changes but I am getting the same error.

klist:

[cloudera@quickstart bin]$ klist -fe
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: t_fajar@ADS.AUTODESK.COM

Valid starting     Expires            Service principal
01/30/17 11:56:20  01/30/17 21:56:24  krbtgt/ADS.AUTODESK.COM@ADS.AUTODESK.COM
        renew until 01/31/17 11:56:20, Flags: FPRIA
        Etype (skey, tkt): aes256-cts-hmac-sha1-96, arcfour-hmac


Log:

[cloudera@quickstart bin]$ export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
[cloudera@quickstart bin]$ ./beeline -u "jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar"
/home/cloudera/workspace/hive/bin/hive: line 99: [: /home/cloudera/workspace/hive/lib/hive-exec-2.2.0-SNAPSHOT-core.jar: binary operator expected
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/benchmarks.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/hive-jdbc-2.2.0-SNAPSHOT-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/spark-examples-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
17/01/30 12:08:59 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[?:1.8.0_73]
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[benchmarks.jar:2.2.0-SNAPSHOT]
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [benchmarks.jar:2.2.0-SNAPSHOT]
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [benchmarks.jar:2.2.0-SNAPSHOT]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [benchmarks.jar:2.2.0-SNAPSHOT]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_73]
        at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_73]
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [benchmarks.jar:2.2.0-SNAPSHOT]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at java.sql.DriverManager.getConnection(DriverManager.java:664) [?:1.8.0_73]
        at java.sql.DriverManager.getConnection(DriverManager.java:208) [?:1.8.0_73]
        at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_73]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_73]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_73]
        at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
        at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:797) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:885) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_73]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_73]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_73]
        at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221) [benchmarks.jar:2.2.0-SNAPSHOT]
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136) [benchmarks.jar:2.2.0-SNAPSHOT]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.8.0_73]
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) ~[?:1.8.0_73]
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.8.0_73]
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) ~[?:1.8.0_73]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.8.0_73]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_73]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_73]
        ... 35 more
17/01/30 12:08:59 [main]: WARN jdbc.HiveConnection: Failed to connect to localhost:10000
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
Beeline version 2.2.0-SNAPSHOT by Apache Hive
beeline>
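The "Java config name: null / Native config name: /etc/krb5.conf" lines in the log come from the krb5 debug flag. A quick way to see which credential cache the JVM should be reading is sketched below; it assumes the MIT-krb5 defaults (KRB5CCNAME override, /tmp/krb5cc_<uid> fallback), which can differ per system.

```shell
#!/bin/sh
# Resolve the credential cache the Kerberos libraries will use:
# KRB5CCNAME wins if set; otherwise the per-uid default /tmp/krb5cc_<uid>.
CCACHE="${KRB5CCNAME:-/tmp/krb5cc_$(id -u)}"
echo "credential cache: $CCACHE"

# Then confirm a TGT is actually in that cache before running beeline:
# klist -c "$CCACHE"
```

If kinit wrote the ticket to a different cache than the one resolved here, the JVM will report "Failed to find any Kerberos tgt" even though klist shows a valid ticket.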



From: Vivek Shrivastava <vivshrivastava@gmail.com>
Sent: Monday, January 30, 2017 11:34:27 AM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

You can comment both default_tkt_enctypes and default_tgs_enctypes out; the default value will become aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96 des3-cbc-sha1 arcfour-hmac-md5 camellia256-cts-cmac camellia128-cts-cmac des-cbc-crc des-cbc-md5 des-cbc-md4.

Then do:

kdestroy
kinit
klist -fev
your beeline command

If it still does not work then paste the output of:

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /
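The first suggestion amounts to a [libdefaults] section like the illustrative fragment below (commenting the two lines out lets libkrb5 fall back to its built-in enctype list, so kinit can negotiate an etype the KDC and the JVM both support):

```
[libdefaults]
        default_realm = ADS.AUTODESK.COM
#       default_tkt_enctypes = ...   # commented out: use library defaults
#       default_tgs_enctypes = ...   # commented out: use library defaults
```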



On Mon, Jan 30, 2017 at 11:11 AM, Ricardo Fajardo <ricardo.fajardo@autodesk.com> wrote:

I don't have any particular reason for selecting the arcfour encryption type. If I need to change it and it will work, I can do that.

Values from krb5.conf:

[libdefaults]
        default_realm = ADS.AUTODESK.COM
        krb4_config = /etc/krb.conf
        krb4_realms = /etc/krb.realms
        kdc_timesync = 1
        ccache_type = 4
        forwardable = true
        proxiable = true
        v4_instance_resolve = false
        v4_name_convert = {
                host = {
                        rcmd = host
                        ftp = ftp
                }
                plain = {
                        something = something-else
                }
        }
        fcc-mit-ticketflags = true
        default_tkt_enctypes = rc4-hmac des-cbc-crc des-cbc-md5 aes256-cts
        default_tgs_enctypes = rc4-hmac des-cbc-crc des-cbc-md5 aes256-cts

[realms]

        ADS.AUTODESK.COM = {
                kdc = krb.ads.autodesk.com:88
                admin_server = krb.ads.autodesk.com
                default_domain = ads.autodesk.com
                database_module = openldap_ldapconf
                master_key_type = aes256-cts
                supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
                default_principal_flags = +preauth
        }

Thanks so much for your help,
Richard.

From: Vivek Shrivastava <vivshrivastava@gmail.com>
Sent: Monday, January 30, 2017 11:01:24 AM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Any particular reason for selecting the arcfour encryption type? Could you please post the defaults (e.g. enc_type) values from krb5.conf?

On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo <ricardo.fajardo@autodesk.com> wrote:


1. klist -fe

[cloudera@quickstart bin]$ klist -fe
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: t_fajar@ADS.AUTODESK.COM

Valid starting     Expires            Service principal
01/30/17 10:52:37  01/30/17 20:52:43  krbtgt/ADS.AUTODESK.COM@ADS.AUTODESK.COM
        renew until 01/31/17 10:52:37, Flags: FPRIA
        Etype (skey, tkt): arcfour-hmac, arcfour-hmac
[cloudera@quickstart bin]$

2. Relevant entries from the HiveServer2 log:

beeline> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
17/01/27 16:16:37 DEBUG Groups: Creating new Groups object
17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
17/01/27 16:16:37 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
17/01/27 16:16:38 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
17/01/27 16:16:38 DEBUG UserGroupInformation: using local user:UnixPrincipal: cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: Using user: "UnixPrincipal: cloudera" with name cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera (auth:SIMPLE)
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod = SIMPLE
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as passed-in authMethod of kerberos != current.
17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208)
17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
17/01/30 10:55:02 DEBUG TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@1119f7c5
17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212) ~[?:1.7.0_67]
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[libthrift-0.9.3.jar:0.9.3]
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [libthrift-0.9.3.jar:0.9.3]
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [libthrift-0.9.3.jar:0.9.3]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [classes/:?]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:1) [classes/:?]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_67]
        at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67]
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [hadoop-common-2.7.2.jar:?]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [classes/:?]
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) [classes/:?]
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) [classes/:?]
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [classes/:?]
        at java.sql.DriverManager.getConnection(DriverManager.java:571) [?:1.7.0_67]
        at java.sql.DriverManager.getConnection(DriverManager.java:187) [?:1.7.0_67]
        at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [classes/:?]
        at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [classes/:?]
        at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [classes/:?]
        at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [classes/:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_67]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_67]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_67]
        at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
        at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [classes/:?]
        at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) [classes/:?]
        at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [classes/:?]
        at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999) [classes/:?]
        at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
        at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) [classes/:?]
        at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.7.0_67]
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121) ~[?:1.7.0_67]
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.7.0_67]
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223) ~[?:1.7.0_67]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.7.0_67]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.7.0_67]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193) ~[?:1.7.0_67]
        ... 29 more
17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with status BAD and payload length 19
17/01/30 10:55:02 WARN HiveConnection: Failed to connect to localhost:10000
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
beeline>


From: Vivek Shrivastava <vivshrivastava@gmail.com>
Sent: Monday, January 30, 2017 10:48:35 AM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Please paste the output of:
1. klist -fe
2. relevant entries from the HiveServer2 log

On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <ricardo.fajardo@autodesk.com> wrote:

I could not resolve the problem.

I have debugged the code and I found the following:

1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, at line 208:

....

UserGroupInformation.getCurrentUser() ...

..

This method always returns the operating-system user, but I need to authenticate the user set in the property hive.server2.proxy.user=yourid, because I have a ticket for that one.

2. I have found that hive.server2.proxy.user is handled in the org.apache.hive.jdbc.HiveConnection class, in the method openSession(), but this code is never executed.

3. In the org.apache.hive.service.auth.HiveAuthFactory class there is this code in the method getAuthTransFactory():

....

      if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName())) {
        // no-op
....

Does this mean that Kerberos authentication is not implemented?

Please, can anyone help me?

Thanks,
Richard.


From: Dulam, Naresh <naresh.dulam@bankofamerica.com>
Sent: Thursday, January 26, 2017 8:41:48 AM
To: user@hive.apache.org
Subject: RE: Pls Help me - Hive Kerberos Issue

kinit yourid -k -t your.keytab yourid@MY-REALM.COM

# Connect using the following JDBC connection string
# jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_HOST@MY-REALM.COM;hive.server2.proxy.user=yourid

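The two steps above (get a TGT, then connect with the proxy-user JDBC string) can be sketched as one script. All values here — host, realm, keytab name, user id — are the placeholders from the message, not real settings:

```shell
#!/bin/sh
# Placeholders: substitute your own HiveServer2 host, realm, keytab and id.
HS2_HOST="myHost.myOrg.com"
HS2_PORT="10000"
REALM="MY-REALM.COM"
USER_ID="yourid"

# _HOST stays literal in the URL: the Hive JDBC driver substitutes the
# server's canonical hostname when it requests the service ticket.
JDBC_URL="jdbc:hive2://${HS2_HOST}:${HS2_PORT}/default;principal=hive/_HOST@${REALM};hive.server2.proxy.user=${USER_ID}"
echo "$JDBC_URL"

# Obtain a TGT first, then connect:
# kinit "$USER_ID" -k -t your.keytab "${USER_ID}@${REALM}"
# beeline -u "$JDBC_URL"
```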

From: Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
Sent: Thursday, January 26, 2017 1:37 AM
To: user@hive.apache.org
Subject: Pls Help me - Hive Kerberos Issue

Hello,

Please, I need your help with Kerberos authentication with Hive.

I am following this guide:

https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1

But I am getting this error:

Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)

I have a remote Kerberos server and I can generate a ticket with kinit for my user. I created a keytab file with my password for my user. Please tell me if that is ok.

On the other hand, when I am debugging the Hive code the operating-system user is authenticated, but I need to authenticate my Kerberos user; can you tell me how I can achieve that? How can I store my tickets where Hive can load them? Or how can I verify where Hive is searching for the tickets and what Hive is reading?

Thanks so much for your help.

Best regards,
Richard.

This message, and any attachments, is for the intended recipient(s) only, may contain information that is privileged, confidential and/or proprietary and subject to important terms and conditions available at http://www.bankofamerica.com/emaildisclaimer. If you are not the intended recipient, please delete this message.





