Subject: Re: SaslException: No common protection layer between client and server ?
From: Chris Nauroth
To: Dmitry Goldenberg, "user@hadoop.apache.org" <user@hadoop.apache.org>
Date: Thu, 9 Jun 2016 20:38:29 +0000
Hello Dmitry,

For a successful SASL authentication, both the client and the server need to negotiate and agree on a quality of protection. The Hadoop configuration property hadoop.rpc.protection supports a comma-separated list of values that map to the SASL QoP values.

<property>
  <name>hadoop.rpc.protection</name>
  <value>authentication</value>
  <description>A comma-separated list of protection values for secured sasl
      connections. Possible values are authentication, integrity and privacy.
      authentication means authentication only and no integrity or privacy;
      integrity implies authentication and integrity are enabled; and privacy
      implies all of authentication, integrity and privacy are enabled.
      hadoop.security.saslproperties.resolver.class can be used to override
      the hadoop.rpc.protection for a connection at the server side.
  </description>
</property>
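If it helps, here is a minimal client-side sketch of offering every QoP value, so that negotiation can succeed regardless of which one the NameNode requires. The property name is the real one discussed above; the class name and the probed path are illustrative, and it assumes a client version that accepts a comma-separated list:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class QopProbe {
    public static void main(String[] args) throws Exception {
        // Assumes core-site.xml on the classpath already sets
        // hadoop.security.authentication=kerberos for this cluster.
        Configuration conf = new Configuration();

        // Offer all three QoP values; the SASL handshake can then settle
        // on whichever one the NameNode side is configured to require.
        conf.set("hadoop.rpc.protection", "authentication,integrity,privacy");
        UserGroupInformation.setConfiguration(conf);

        // Assumes a valid Kerberos ticket already exists (e.g. from kinit).
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.exists(new Path("/")));
    }
}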

This means that for this application to work, both the client end and the server end must have at least one value in common in that list.  Judging from the stack trace, it looks like the client is trying to communicate with the HDFS NameNode, so I recommend that you review the setting for this configuration property at the NameNode and the application.  If you change the configuration property for the NameNode, then it will require a restart for the change to take effect.

If this still isn't working, then it's possible that Java's SASL debugging might provide more clues.  Details on how to enable that logging are here:

https://docs.oracle.com/javase/8/docs/technotes/guides/security/sasl/sasl-refguide.html#DEBUG
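For reference, a sketch of enabling it programmatically: the JDK's SASL mechanisms log through java.util.logging under the "javax.security.sasl" logger (per the guide above), so raising that logger to FINEST should surface the negotiation details. The class name here is just illustrative:

import java.util.logging.ConsoleHandler;
import java.util.logging.Level;
import java.util.logging.Logger;

public final class SaslDebugLogging {
    public static void enable() {
        // The JDK SASL implementation logs under this logger name.
        Logger sasl = Logger.getLogger("javax.security.sasl");
        sasl.setLevel(Level.FINEST);
        ConsoleHandler handler = new ConsoleHandler();
        handler.setLevel(Level.FINEST); // the handler must also pass FINEST
        sasl.addHandler(handler);
    }
}

The same effect is available without code changes through a logging.properties file containing javax.security.sasl.level = FINEST, passed with -Djava.util.logging.config.file=...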
--Chris Nauroth

From: Dmitry Goldenberg <dgoldenberg123@gmail.com>
Date: Wednesday, June 8, 2016 at 4:30 PM
To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Subject: SaslException: No common protection layer between client and server ?

I'm trying to communicate programmatically with a Hadoop cluster which is kerberized (CDH 5.3/HDFS 2.5.0).

I have a valid Kerberos token on the client side.  But I'm getting an error as below, "No common protection layer between client and server".

What does this error mean and are there any ways to fix or work around it?

Is this something related to HDFS-5688? The ticket seems to imply that the property "hadoop.rpc.protection" must be set, presumably to "authentication" (also per e.g. this).

Would this need to be set on all servers in the cluster and then the cluster bounced?  I don't have easy access to the cluster so I need to understand whether 'hadoop.rpc.protection' is the actual cause. It seems that 'authentication' should be the value used by default, at least according to the core-default.xml documentation.
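For reference, a minimal sketch of how I could print the effective client-side value (assuming the standard Hadoop Configuration loading, which picks up core-default.xml from the jars plus any core-site.xml on the classpath; the class name is just illustrative):

import org.apache.hadoop.conf.Configuration;

public class ShowRpcProtection {
    public static void main(String[] args) {
        // Loads core-default.xml and core-site.xml from the classpath.
        Configuration conf = new Configuration();
        // Falls back to "authentication" if nothing sets the property.
        System.out.println(conf.get("hadoop.rpc.protection", "authentication"));
    }
}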

java.io.IOException: Failed on local exception: java.io.IOException: Couldn't setup connection for principal1/server1.acme.net@xxx.acme.net to server2.acme.net/10.XX.XXX.XXX:8020; Host Details : local host is: "some-host.acme.net/168.XX.XXX.XX"; destination host is: "server2.acme.net":8020;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
        at org.apache.hadoop.ipc.Client.call(Client.java:1415)
        at org.apache.hadoop.ipc.Client.call(Client.java:1364)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
        at com.sun.proxy.$Proxy24.getFileInfo(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy24.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:707)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1785)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1068)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
        ... 11 more
Caused by: java.io.IOException: Couldn't setup connection for principal1/server1.acme.net@xxx.acme.net to server2.acme.net/10.XX.XXX.XXX:8020;
        at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:671)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:642)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:725)
        at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1463)
        at org.apache.hadoop.ipc.Client.call(Client.java:1382)
        ... 31 more
Caused by: javax.security.sasl.SaslException: No common protection layer between client and server
        at com.sun.security.sasl.gsskerb.GssKrb5Client.doFinalHandshake(GssKrb5Client.java:251)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:186)
        at org.apache.hadoop.security.SaslRpcClient.saslEvaluateToken(SaslRpcClient.java:483)
        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:427)
        at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:552)
        at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:367)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:717)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:713)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
        ... 34 more
