From: Aneela Saleem
To: user@hbase.apache.org
Date: Tue, 2 Aug 2016 21:11:32 +0500
Subject: Re: issue starting regionserver with SASL authentication failed

I have enabled Kerberos debugging on the Hadoop command line, so when I run
the "hadoop fs -ls /" command I get the following output, which I can't
interpret. Can you please tell me whether something is wrong with the
Kerberos configuration, or whether everything is fine?

16/08/02 18:34:10 DEBUG util.Shell: setsid exited with exit code 0
16/08/02 18:34:10 DEBUG conf.Configuration: parsing URL jar:file:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.7.2.jar!/core-default.xml
16/08/02 18:34:10 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@4fbc7b65
16/08/02 18:34:10 DEBUG conf.Configuration: parsing URL file:/usr/local/hadoop/etc/hadoop/core-site.xml
16/08/02 18:34:10 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@69c1adfa
16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, always=false, type=DEFAULT, sampleName=Ops)
16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, always=false, type=DEFAULT, sampleName=Ops)
16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[GetGroups], about=, always=false, type=DEFAULT, sampleName=Ops)
16/08/02 18:34:11 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
16/08/02 18:34:11 DEBUG security.Groups: Creating new Groups object
16/08/02 18:34:11 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.LdapGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
>>>KinitOptions cache name is /tmp/krb5cc_0
>>>DEBUG client principal is nn/hadoop-master@platalyticsrealm
>>>DEBUG server principal is krbtgt/platalyticsrealm@platalyticsrealm
>>>DEBUG key type: 16
>>>DEBUG auth time: Tue Aug 02 18:23:59 PKT 2016
>>>DEBUG start time: Tue Aug 02 18:23:59 PKT 2016
>>>DEBUG end time: Wed Aug 03 06:23:59 PKT 2016
>>>DEBUG renew_till time: Tue Aug 09 18:23:59 PKT 2016
>>> CCacheInputStream: readFlags() FORWARDABLE; RENEWABLE; INITIAL;
>>>DEBUG client principal is nn/hadoop-master@platalyticsrealm
>>>DEBUG server principal is X-CACHECONF:/krb5_ccache_conf_data/fast_avail/krbtgt/platalyticsrealm@platalyticsrealm
>>>DEBUG key type: 0
>>>DEBUG auth time: Thu Jan 01 05:00:00 PKT 1970
>>>DEBUG start time: null
>>>DEBUG end time: Thu Jan 01 05:00:00 PKT 1970
>>>DEBUG renew_till time: null
>>> CCacheInputStream: readFlags()
16/08/02 18:34:11 DEBUG security.UserGroupInformation: hadoop login
16/08/02 18:34:11 DEBUG security.UserGroupInformation: hadoop login commit
16/08/02 18:34:11 DEBUG security.UserGroupInformation: using kerberos user:nn/hadoop-master@platalyticsrealm
16/08/02 18:34:11 DEBUG security.UserGroupInformation: Using user: "nn/hadoop-master@platalyticsrealm" with name nn/hadoop-master@platalyticsrealm
16/08/02 18:34:11 DEBUG security.UserGroupInformation: User entry: "nn/hadoop-master@platalyticsrealm"
16/08/02 18:34:11 DEBUG security.UserGroupInformation: UGI loginUser:nn/hadoop-master@platalyticsrealm (auth:KERBEROS)
16/08/02 18:34:12 DEBUG security.UserGroupInformation: Found tgt Ticket (hex) =
0000: 61 82 01 72 30 82 01 6E A0 03 02 01 05 A1 12 1B   a..r0..n........
0010: 10 70 6C 61 74 61 6C 79 74 69 63 73 72 65 61 6C   .platalyticsreal
0020: 6D A2 25 30 23 A0 03 02 01 02 A1 1C 30 1A 1B 06   m.%0#.......0...
0030: 6B 72 62 74 67 74 1B 10 70 6C 61 74 61 6C 79 74   krbtgt..platalyt
0040: 69 63 73 72 65 61 6C 6D A3 82 01 2A 30 82 01 26   icsrealm...*0..&
0050: A0 03 02 01 10 A1 03 02 01 01 A2 82 01 18 04 82   ................
0060: 01 14 A5 A9 41 A6 B7 0E 8F 70 F4 03 41 64 8D DC   ....A....p..Ad..
0070: 78 2F FB 08 58 C9 39 44 CF D0 8D B0 85 09 62 8C   x/..X.9D......b.
0080: 40 CF 45 13 D3 B9 CD 38 84 92 33 24 B2 0D C1 65   @.E....8..3$...e
0090: C7 1B 0D 3E F2 92 A2 8B 58 34 77 5F F6 E3 AA B6   ...>....X4w_....
00A0: EB 8E 58 46 AC 54 DB 9B 79 3E ED A1 83 0C D3 D3   ..XF.T..y>......
00B0: 02 8B 42 52 6D 92 F1 39 BA E7 56 D4 BA A6 03 B6   ..BRm..9..V.....
00C0: 16 5A DC 1A 69 F4 DF A5 CD F6 48 AC 08 32 D3 AD   .Z..i.....H..2..
00D0: 22 8E E9 52 00 93 78 41 1C 26 4F 0B 42 2C EF E9   "..R..xA.&O.B,..
00E0: B8 0E 84 39 E4 AF 3A 60 7D 04 EE 70 18 C0 E7 21   ...9..:`...p...!
00F0: 0B 70 18 42 33 5E D9 CA 94 C0 6F 6A C0 39 72 7B   .p.B3^....oj.9r.
0100: FD 6E F1 09 CE 2D 02 EA DA 52 5C 1B B2 18 36 0E   .n...-...R\...6.
0110: 54 94 DD 7A 47 A8 F2 36 53 18 3D D7 5C 68 58 71   T..zG..6S.=.\hXq
0120: 63 DB 36 88 B9 87 62 DC BA 86 C3 F0 55 05 D8 15   c.6...b.....U...
0130: 6E 70 FD 8E 64 63 3D 51 36 EC 9E 63 30 77 BE 98   np..dc=Q6..c0w..
0140: 1D A0 DC 97 04 6F 03 AB 12 52 F8 68 7C 6C D0 88   .....o...R.h.l..
0150: 16 FC 17 69 3E 02 4B 59 E8 22 B3 1B 13 70 B2 6A   ...i>.KY."...p.j
0160: 3F 05 3B 1C 91 3D 03 A8 30 64 1C B1 59 42 17 FB   ?.;..=..0d..YB..
0170: 1B B2 76 E0 BC 49                                  ..v..I
Client Principal = nn/hadoop-master@platalyticsrealm
Server Principal = krbtgt/platalyticsrealm@platalyticsrealm
Session Key = EncryptionKey: keyType=16 keyBytes (hex dump)=
0000: B5 4A 9B 0E 1C 6D 1C 34 D5 DF DA F2 9D 4C C2 FE   .J...m.4.....L..
0010: D9 0D 67 A2 79 6D 8C 0D                            ..g.ym..
Forwardable Ticket true
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket true
Initial Ticket true
Auth Time = Tue Aug 02 18:23:59 PKT 2016
Start Time = Tue Aug 02 18:23:59 PKT 2016
End Time = Wed Aug 03 06:23:59 PKT 2016
Renew Till = Tue Aug 09 18:23:59 PKT 2016
Client Addresses Null
16/08/02 18:34:12 DEBUG security.UserGroupInformation: Current time is 1470144852023
16/08/02 18:34:12 DEBUG security.UserGroupInformation: Next refresh is 1470178799000
16/08/02 18:34:12 TRACE tracing.SpanReceiverHost: No span receiver names found in dfs.client.htrace.spanreceiver.classes.
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
16/08/02 18:34:12 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
16/08/02 18:34:12 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@4219a40f
16/08/02 18:34:12 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:13 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
16/08/02 18:34:13 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
16/08/02 18:34:13 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@1a1ff7d1: starting with interruptCheckPeriodMs = 60000
16/08/02 18:34:13 TRACE unix.DomainSocketWatcher: DomainSocketWatcher(1934811148): adding notificationSocket 191, connected to 190
16/08/02 18:34:13 DEBUG util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
16/08/02 18:34:13 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
16/08/02 18:34:13 TRACE ipc.ProtobufRpcEngine: 1: Call -> /192.168.23.206:8020: getFileInfo {src: "/"}
16/08/02 18:34:13 DEBUG ipc.Client: The ping interval is 60000 ms.
16/08/02 18:34:13 DEBUG ipc.Client: Connecting to /192.168.23.206:8020
16/08/02 18:34:13 DEBUG security.UserGroupInformation: PrivilegedAction as:nn/hadoop-master@platalyticsrealm (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724)
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state: NEGOTIATE
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message state: NEGOTIATE
auths {
  method: "TOKEN"
  mechanism: "DIGEST-MD5"
  protocol: ""
  serverId: "default"
  challenge: "realm=\"default\",nonce=\"xHi0jI3ZHzKXd2aQ0Gqx4N1qcgbdJAWBCa36ZeSO\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
}
auths {
  method: "KERBEROS"
  mechanism: "GSSAPI"
  protocol: "nn"
  serverId: "hadoop-master"
}
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal)
16/08/02 18:34:13 DEBUG security.SaslRpcClient: RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is nn/hadoop-master@platalyticsrealm
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop-master
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Use KERBEROS authentication for protocol ClientNamenodeProtocolPB
Found ticket for nn/hadoop-master@platalyticsrealm to go to krbtgt/platalyticsrealm@platalyticsrealm expiring on Wed Aug 03 06:23:59 PKT 2016
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for nn/hadoop-master@platalyticsrealm to go to krbtgt/platalyticsrealm@platalyticsrealm expiring on Wed Aug 03 06:23:59 PKT 2016
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
Using builtin default etypes for default_tgs_enctypes
default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> KdcAccessibility: reset
>>> KrbKdcReq send: kdc=platalytics.com UDP:88, timeout=30000, number of retries =3, #bytes=727
>>> KDCCommunication: kdc=platalytics.com UDP:88, timeout=30000,Attempt=1, #bytes=727
>>> KrbKdcReq send: #bytes read=686
>>> KdcAccessibility: remove platalytics.com
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> KrbApReq: APOptions are 00100000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
Krb5Context setting mySeqNumber to: 822249937
Created InitSecContextToken:
0000: 01 00 6E 82 02 67 30 82 02 63 A0 03 02 01 05 A1   ..n..g0..c......
0010: 03 02 01 0E A2 07 03 05 00 20 00 00 00 A3 82 01   ......... ......
0020: 6F 61 82 01 6B 30 82 01 67 A0 03 02 01 05 A1 12   oa..k0..g.......
0030: 1B 10 70 6C 61 74 61 6C 79 74 69 63 73 72 65 61   ..platalyticsrea
0040: 6C 6D A2 1E 30 1C A0 03 02 01 00 A1 15 30 13 1B   lm..0........0..
0050: 02 6E 6E 1B 0D 68 61 64 6F 6F 70 2D 6D 61 73 74   .nn..hadoop-mast
0060: 65 72 A3 82 01 2A 30 82 01 26 A0 03 02 01 10 A1   er...*0..&......
0070: 03 02 01 04 A2 82 01 18 04 82 01 14 25 56 29 BE   ............%V).
0080: 2E AA 50 55 7B 2C 5C AC BA 64 2D 4D 8D 9C 71 B1   ..PU.,\..d-M..q.
0090: 1A 99 14 81 4C 98 80 B2 65 86 6C 37 61 67 31 D1   ....L...e.l7ag1.
00A0: 6F F6 E7 7A F3 92 A5 9A F0 BA A5 BE 1C 15 7F 14   o..z............
00B0: 85 7E B0 7A 81 3D 9C B6 00 80 43 00 2A 0C 89 6A   ...z.=....C.*..j
00C0: B1 49 EF 27 F9 97 A1 3E 5C 80 B7 0D 49 6C E0 A3   .I.'...>\...Il..
00D0: 73 BC C2 69 AE 92 88 26 C5 DA FD 6E AB 55 F7 60   s..i...&...n.U.`
00E0: D0 7E 3A A5 5D 78 4E 3F 3D 96 44 6B B9 8F EA D8   ..:.]xN?=.Dk....
00F0: 4E BA 70 F3 5C 25 4E ED AD E2 76 09 FF 36 D8 6D   N.p.\%N...v..6.m
0100: A4 22 C3 93 10 04 04 F2 6C D4 04 C9 A9 14 95 47   ."......l......G
0110: 16 BA 62 6F 58 5F 4F 8E 38 23 A5 5C 1D 58 F8 D5   ..boX_O.8#.\.X..
0120: 87 23 3D 7F 0B A7 BE 18 25 1F F1 7B 4C 54 EC BD   .#=.....%...LT..
0130: A6 D4 05 4C 82 03 64 FD 5A 4E 24 D8 71 D5 5A 15   ...L..d.ZN$.q.Z.
0140: 4C 2E E3 12 88 19 19 09 C1 F9 31 9D 6E CE D4 6F   L.........1.n..o
0150: 7A 20 F6 82 BB F6 28 D1 ED A3 54 69 01 9E A4 4C   z ....(...Ti...L
0160: 40 E2 E0 FC F5 35 44 C1 25 8C 50 1F C0 01 1D C0   @....5D.%.P.....
0170: 63 A5 45 B8 56 DF F7 F8 CA 86 8B 96 0C 5C 49 EA   c.E.V........\I.
0180: F0 A9 70 9C 2E 0E 36 57 65 47 97 09 8C 24 F1 00   ..p...6WeG...$..
0190: A4 81 DA 30 81 D7 A0 03 02 01 10 A2 81 CF 04 81   ...0............
01A0: CC F1 F6 BE 3A A7 C0 1A 04 D0 72 DE 57 94 D1 FE   ....:.....r.W...
01B0: 16 7E E8 09 72 D7 83 54 B3 1C 98 59 36 86 78 12   ....r..T...Y6.x.
01C0: A5 02 E3 B6 8C C6 83 B5 C9 7C 53 A3 C9 79 AF C8   ..........S..y..
01D0: B8 1A B3 B2 A6 7E 02 1A A5 9C 41 EA 08 87 A8 E5   ..........A.....
01E0: D1 0E ED 69 5C CA 33 63 24 C8 4B E1 57 D5 C3 AF   ...i\.3c$.K.W...
01F0: 39 0A DE F6 9F 63 3B 44 79 5B 29 F7 9A B0 2E 8B   9....c;Dy[).....
0200: 1C EF 4A 0B D9 3A 55 75 C5 38 B7 5C 50 11 0E 74   ..J..:Uu.8.\P..t
0210: BE 57 DC 70 30 DD AF 14 35 97 1C 14 11 70 46 FD   .W.p0...5....pF.
0220: F9 8C 14 60 DE 35 D8 DC 81 86 C7 31 1F F8 6A 65   ...`.5.....1..je
0230: 2D B7 8A EF F2 61 21 00 2C 8D 4F 3A 49 1E 24 80   -....a!.,.O:I.$.
0240: FA 56 D0 2D 0E 52 AE 29 2B 6A 4A C7 16 8F B5 D8   .V.-.R.)+jJ.....
0250: EC 41 18 03 34 F2 D8 94 79 82 C8 0D E2 10 72 39   .A..4...y.....r9
0260: 85 B9 F7 BB 54 5C 71 21 49 23 A5 4A D0             ....T\q!I#.J.
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state: INITIATE
token: "`\202\002x\006\t*\206H\206\367\022\001\002\002\001\000n\202\002g0\202\002c\240\003\002\001\005\241\003\002\001\016\242\a\003\005\000 \000\000\000\243\202\001oa\202\001k0\202\001g\240\003\002\001\005\241\022\033\020platalyticsrealm\242\0360\034\240\003\002\001\000\241\0250\023\033\002nn\033\rhadoop-master\243\202\001*0\202\001&\240\003\002\001\020\241\003\002\001\004\242\202\001\030\004\202\001\024%V)\276.\252PU{,\\\254\272d-M\215\234q\261\032\231\024\201L\230\200\262e\206l7ag1\321o\366\347z\363\222\245\232\360\272\245\276\034\025 \024\205~\260z\201=\234\266\000\200C\000*\f\211j\261I\357\'\371\227\241>\\\200\267\rIl\340\243s\274\302i\256\222\210&\305\332\375n\253U\367`\320~:\245]xN?=\226Dk\271\217\352\330N\272p\363\\%N\355\255\342v\t\3776\330m\244\"\303\223\020\004\004\362l\324\004\311\251\024\225G\026\272boX_O\2168#\245\\\035X\370\325\207#= \v\247\276\030%\037\361{LT\354\275\246\324\005L\202\003d\375ZN$\330q\325Z\025L.\343\022\210\031\031\t\301\3711\235n\316\324oz \366\202\273\366(\321\355\243Ti\001\236\244L@ \342\340\374\3655D\301%\214P\037\300\001\035\300c\245E\270V\337\367\370\312\206\213\226\f\\I\352\360\251p\234.\0166WeG\227\t\214$\361\000\244\201\3320\201\327\240\003\002\001\020\242\201\317\004\201\314\361\366\276:\247\300\032\004\320r\336W\224\321\376\026~\350\tr\327\203T\263\034\230Y6\206x\022\245\002\343\266\214\306\203\265\311|S\243\311y\257\310\270\032\263\262\246~\002\032\245\234A\352\b\207\250\345\321\016\355i\\\3123c$\310K\341W\325\303\2579\n\336\366\237c;Dy[)\367\232\260.\213\034\357J\v\331:Uu\3058\267\\P\021\016t\276W\334p0\335\257\0245\227\034\024\021pF\375\371\214\024`\3365\330\334\201\206\3071\037\370je-\267\212\357\362a!\000,\215O:I\036$\200\372V\320-\016R\256)+jJ\307\026\217\265\330\354A\030\0034\362\330\224y\202\310\r\342\020r9\205\271\367\273T\\q!I#\245J\320"
auths {
  method: "KERBEROS"
  mechanism: "GSSAPI"
  protocol: "nn"
  serverId: "hadoop-master"
}
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message state: CHALLENGE
token: "`l\006\t*\206H\206\367\022\001\002\002\002\000o]0[\240\003\002\001\005\241\003\002\001\017\242O0M\240\003\002\001\020\242F\004D\337\316\251\336\365\261O@\377 \"\035\203\002\357Z\231e\332\357\364\204>d\325\"\340\263\2302\031\277\023G\342=\355\334)\303\271\t\376\252\225\207\033\000\243\332\252\335{\"\033\025 \fW\225\300\375\272\201\367\216\371\273"
Entered Krb5Context.initSecContext with state=STATE_IN_PROCESS
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
Krb5Context setting peerSeqNumber to: 766454664
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state: RESPONSE
token: ""
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message state: CHALLENGE
token: "`?\006\t*\206H\206\367\022\001\002\002\002\001\004\000\377\377\377\377\272 \237\354\300\003\367{\207A\267\371\245\327\374\333\021\026\375}\353\035\254\327\305\272\373\305\365L\022\374.A\203\002\001\001\000\000\004\004\004\004"
Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01 04 00 ff ff ff ff ba 20 9f ec c0 03 f7 7b 87 41 b7 f9 a5 d7 fc db 11 16 fd 7d eb 1d ac d7 c5 ba fb c5 f5 4c 12 fc 2e 41 83 02 01 01 00 00 04 04 04 04 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01 04 00 ff ff ff ff 33 b9 e5 96 b6 c8 d3 80 4f 8a a1 5b 44 c9 b6 76 ea fe ec 80 be 37 12 e1 04 cc e5 0f 2a f8 16 1b 9e 72 17 dc 01 01 00 00 04 04 04 04 ]
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state: RESPONSE
token: "`?\006\t*\206H\206\367\022\001\002\002\002\001\004\000\377\377\377\3773\271\345\226\266\310\323\200O\212\241[D\311\266v\352\376\354\200\2767\022\341\004\314\345\017*\370\026\033\236r\027\334\001\001\000\000\004\004\004\004"
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message state: SUCCESS
16/08/02 18:34:13 DEBUG ipc.Client: Negotiated QOP is :auth
16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: starting, having connections 1
16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm sending #0
16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm got value #0
16/08/02 18:34:13 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 594ms
16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Response <- /192.168.23.206:8020: getFileInfo {fs { fileType: IS_DIR path: "" length: 0 permission { perm: 493 } owner: "hdfs" group: "supergroup" modification_time: 1470131070337 access_time: 0 block_replication: 0 blocksize: 0 fileId: 16385 childrenNum: 1 storagePolicy: 0 }}
16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Call -> /192.168.23.206:8020: getListing {src: "/" startAfter: "" needLocation: false}
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm sending #1
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm got value #1
16/08/02 18:34:14 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 7ms
16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Response <- /192.168.23.206:8020: getListing {dirList { partialListing { fileType: IS_DIR path: "ranger" length: 0 permission { perm: 493 } owner: "hdfs" group: "supergroup" modification_time: 1470131070364 access_time: 0 block_replication: 0 blocksize: 0 fileId: 16386 childrenNum: 1 storagePolicy: 0 } remainingEntries: 0 }}
*Found 1 items*
*drwxr-xr-x   - hdfs supergroup          0 2016-08-02 14:44 /ranger*
16/08/02 18:34:14 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:14 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:14 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:14 DEBUG ipc.Client: Stopping client
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: closed
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: stopped, remaining connections 0

On Tue, Aug 2, 2016 at 9:03 PM, Dima Spivak wrote:

> Hm, not sure what to say. The error seems to be pointing at not having a
> TGT...
>
> -Dima
>
> On Tue, Aug 2, 2016 at 12:45 AM, Aneela Saleem wrote:
>
> > Yes, I have kinit'd as the service user, but I am still getting the error.
> >
> > On Tue, Aug 2, 2016 at 3:05 AM, Dima Spivak wrote:
> >
> > > The stacktrace suggests you don't have a ticket-granting ticket. Have
> > > you kinit'd as the service user?
> > >
> > > -Dima
> > >
> > > On Sun, Jul 31, 2016 at 11:19 PM, Aneela Saleem <aneela@platalytics.com> wrote:
> > >
> > > > Hi Dima,
> > > >
> > > > I followed the official reference guide now, but I still get the same error.
> > > > Attached is the hbase-site.xml file; please have a look. What's wrong there?
> > > >
> > > > On Thu, Jul 28, 2016 at 11:58 PM, Dima Spivak wrote:
> > > >
> > > >> I haven't looked in detail at your hbase-site.xml, but if you're running
> > > >> Apache HBase (and not a CDH release), I might recommend using the official
> > > >> reference guide [1] to configure your cluster instead of the CDH 4.2.0 docs,
> > > >> since those correspond to HBase 0.94 and might well have different steps
> > > >> required to set up security. If you are trying out CDH HBase, be sure to
> > > >> use up-to-date documentation for your release.
> > > >>
> > > >> Let us know how it goes.
> > > >>
> > > >> [1] https://hbase.apache.org/book.html#hbase.secure.configuration
> > > >>
> > > >> -Dima
> > > >>
> > > >> On Thu, Jul 28, 2016 at 10:09 AM, Aneela Saleem <aneela@platalytics.com> wrote:
> > > >>
> > > >> > Hi Dima,
> > > >> >
> > > >> > I'm running HBase version 1.2.2.
> > > >> >
> > > >> > On Thu, Jul 28, 2016 at 8:35 PM, Dima Spivak <dspivak@cloudera.com> wrote:
> > > >> >
> > > >> > > Hi Aneela,
> > > >> > >
> > > >> > > What version of HBase are you running?
> > > >> > >
> > > >> > > -Dima
> > > >> > >
> > > >> > > On Thursday, July 28, 2016, Aneela Saleem <aneela@platalytics.com> wrote:
> > > >> > >
> > > >> > > > Hi,
> > > >> > > >
> > > >> > > > I have successfully configured Zookeeper with Kerberos authentication.
> > > >> > > > Now I'm facing an issue while configuring HBase with Kerberos
> > > >> > > > authentication. I have followed this link:
> > > >> > > > http://www.cloudera.com/documentation/archive/cdh/4-x/4-2-0/CDH4-Security-Guide/cdh4sg_topic_8_2.html
> > > >> > > > Attached are the configuration files, i.e., hbase-site.xml and
> > > >> > > > zk-jaas.conf.
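> > > >> > > > For reference, the kind of settings involved look roughly like this in
> > > >> > > > hbase-site.xml (a sketch only; the principal names and keytab paths
> > > >> > > > below are placeholders, not the actual attached files):
> > > >> > > >
> > > >> > > >   <property>
> > > >> > > >     <name>hbase.security.authentication</name>
> > > >> > > >     <value>kerberos</value>
> > > >> > > >   </property>
> > > >> > > >   <property>
> > > >> > > >     <name>hbase.regionserver.kerberos.principal</name>
> > > >> > > >     <value>hbase/_HOST@platalyticsrealm</value>
> > > >> > > >   </property>
> > > >> > > >   <property>
> > > >> > > >     <name>hbase.regionserver.keytab.file</name>
> > > >> > > >     <value>/etc/hbase/conf/hbase.keytab</value>
> > > >> > > >   </property>
> > > >> > > >   <property>
> > > >> > > >     <name>hbase.master.kerberos.principal</name>
> > > >> > > >     <value>hbase/_HOST@platalyticsrealm</value>
> > > >> > > >   </property>
> > > >> > > >   <property>
> > > >> > > >     <name>hbase.master.keytab.file</name>
> > > >> > > >     <value>/etc/hbase/conf/hbase.keytab</value>
> > > >> > > >   </property>
> > > >> > > >
> > > >> > > > and zk-jaas.conf is shaped roughly like:
> > > >> > > >
> > > >> > > >   Client {
> > > >> > > >     com.sun.security.auth.module.Krb5LoginModule required
> > > >> > > >     useKeyTab=true
> > > >> > > >     useTicketCache=false
> > > >> > > >     storeKey=true
> > > >> > > >     keyTab="/etc/hbase/conf/hbase.keytab"
> > > >> > > >     principal="hbase/hadoop-master@platalyticsrealm";
> > > >> > > >   };
> > > >> > > >
> > > >> > > > (The service user's ticket would come from the same keytab, e.g.
> > > >> > > > kinit -kt /etc/hbase/conf/hbase.keytab hbase/hadoop-master@platalyticsrealm;
> > > >> > > > again, the path and principal here are placeholders.)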
> > > >> > > >
> > > >> > > > Following are the logs from the regionserver:
> > > >> > > >
> > > >> > > > 2016-07-28 17:44:56,881 WARN [regionserver/hadoop-master/192.168.23.206:16020] regionserver.HRegionServer: error telling master we are up
> > > >> > > > com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to hadoop-master/192.168.23.206:16000
> > > >> > > >         at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:240)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
> > > >> > > >         at org.apache.hadoop.hbase.protobuf.generated.RegionServerStatusProtos$RegionServerStatusService$BlockingStub.regionServerStartup(RegionServerStatusProtos.java:8982)
> > > >> > > >         at org.apache.hadoop.hbase.regionserver.HRegionServer.reportForDuty(HRegionServer.java:2284)
> > > >> > > >         at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:906)
> > > >> > > >         at java.lang.Thread.run(Thread.java:745)
> > > >> > > > Caused by: java.io.IOException: Could not set up IO Streams to hadoop-master/192.168.23.206:16000
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:785)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1241)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
> > > >> > > >         ... 5 more
> > > >> > > > Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:685)
> > > >> > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > >> > > >         at javax.security.auth.Subject.doAs(Subject.java:415)
> > > >> > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:643)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:751)
> > > >> > > >         ... 9 more
> > > >> > > > Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> > > >> > > >         at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
> > > >> > > >         at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
> > > >> > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > >> > > >         at javax.security.auth.Subject.doAs(Subject.java:415)
> > > >> > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
> > > >> > > >         at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
> > > >> > > >         ... 9 more
> > > >> > > > Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
> > > >> > > >         at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
> > > >> > > >         at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
> > > >> > > >         at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
> > > >> > > >         at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
> > > >> > > >         at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
> > > >> > > >         at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> > > >> > > >         at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
> > > >> > > >
> > > >> > > > Please have a look; what's going wrong here?
> > > >> > > >
> > > >> > > > Thanks
> > > >> > >
> > > >> >
> > > >>
> > > >
> > >
> > > --
> > > -Dima
> >
>
> --
> -Dima