hadoop-mapreduce-user mailing list archives

From "Zheng, Kai" <kai.zh...@intel.com>
Subject RE: Issue with WebHDFS Kerberos authentication, when running with JVM-1.7.0-SR1 or above
Date Thu, 07 Jan 2016 13:12:57 GMT
I had a rough look through your logs. They are detailed enough that the problem shows up.

There was a line that said “error Message is Server not found in Kerberos database”, and
the service name is “HTTP@isodvm132.in.ibm.com”, so that’s the problem: the service name
isn’t correct, no matching key can be picked up from your keytab, and it therefore
complains “no credential”.

I guess the HTTP service principal should be something like “HTTP/YOUR-FQDN@YOUR-REALM”.
Please check it in the KDC database and your Hadoop configuration.
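For reference, WebHDFS reads its SPNEGO principal from the Hadoop configuration; a minimal sketch of the relevant hdfs-site.xml entries (the keytab path and realm are placeholders, not taken from this thread):

```xml
<!-- Sketch only: keytab path and realm are placeholders. -->
<property>
  <name>dfs.web.authentication.kerberos.principal</name>
  <value>HTTP/_HOST@YOUR-REALM</value>
</property>
<property>
  <name>dfs.web.authentication.kerberos.keytab</name>
  <value>/etc/security/keytabs/spnego.service.keytab</value>
</property>
```

With `_HOST`, Hadoop substitutes the node's FQDN at startup, so a matching `HTTP/<fqdn>` principal must exist in both the KDC and the keytab.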

Note your krb5.conf looks fine.

Regards,
Kai

From: Ravi Tatapudi [mailto:ravi_tatapudi@in.ibm.com]
Sent: Thursday, January 07, 2016 7:26 PM
To: Zheng, Kai <kai.zheng@intel.com>
Cc: user@hadoop.apache.org
Subject: RE: Issue with WebHDFS Kerberos authentication, when running with JVM-1.7.0-SR1 or
above

Hello Kai:

Many thanks for the info. I have executed the tests with the settings "-Dcom.ibm.security.jgss.debug=all
-Dcom.ibm.security.krb5.Krb5Debug=all", using IBM JDK-1.7-GA and IBM JDK-1.7-SR1. When I compare
the outputs, I see the following error in the failed case (the run with JDK-1.7-SR1), indicating
"No realm from cfg domain-realm mapping for hostname..." (is this something to do with the
entries in "krb5.conf"?)

===============================================================================================
324 [JGSS_DBG_CRED]  main Cannonicalizing hostbased service name HTTP@isodvm132.in.ibm.com
325 [JGSS_DBG_CRED]  main Hostname isodvm132.in.ibm.com
326 [JGSS_DBG_CRED]   No realm from cfg domain-realm mapping for hostname
327 [JGSS_DBG_CRED]   Domain .in.ibm.com
328 [JGSS_DBG_CRED]   No realm from cfg domain-realm mapping for domain; making realm from domain (suffix part)
329 [JGSS_DBG_CRED]  main Realm IN.IBM.COM
===============================================================================================
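For what it's worth, the fallback in lines 326-329 just derives the realm from the hostname's domain suffix. A small shell sketch of that logic (hostname taken from the log above; this only illustrates how the realm is guessed when no [domain_realm] entry matches):

```shell
#!/bin/sh
# Mimic the JGSS fallback from the log: with no [domain_realm] match,
# the realm is derived from the hostname's domain suffix, upper-cased.
host="isodvm132.in.ibm.com"
domain=".${host#*.}"                                           # .in.ibm.com
realm=$(printf '%s' "${host#*.}" | tr '[:lower:]' '[:upper:]') # IN.IBM.COM
echo "Domain $domain"
echo "Realm $realm"
```

So the derived realm IN.IBM.COM in line 329 is expected behaviour, not itself an error.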

Please find attached a zip file with the complete test outputs and the "krb5.conf" used for
these tests. Could you please take a look and let me know your thoughts.
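If the domain-realm mapping does turn out to be the problem, the usual fix is an explicit [domain_realm] stanza in krb5.conf; a sketch, using the realm the fallback derived (assumed, not confirmed from this thread):

```
[domain_realm]
    .in.ibm.com = IN.IBM.COM
    in.ibm.com = IN.IBM.COM
```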



Thanks,
 Ravi



From:        "Zheng, Kai" <kai.zheng@intel.com>
To:        "user@hadoop.apache.org" <user@hadoop.apache.org>
Date:        01/07/2016 12:19 PM
Subject:        RE: Issue with WebHDFS Kerberos authentication, when running with JVM-1.7.0-SR1 or above
________________________________



It looks like with the JDK versions in question, your client credential (TGT) isn’t able to
authenticate to the WebHDFS server, or the ticket doesn’t match any keytab key configured
for the HTTP service principal. You may inspect your keytab file closely, checking the key version,
encryption types and so on. More logs may be helpful; for the IBM JDK, I guess you can try
the following.

-Dcom.ibm.security.jgss.debug=all -Dcom.ibm.security.krb5.Krb5Debug=all

Regards,
Kai

From: Ravi Tatapudi [mailto:ravi_tatapudi@in.ibm.com]
Sent: Wednesday, January 06, 2016 8:10 PM
To: user@hadoop.apache.org
Subject: Issue with WebHDFS Kerberos authentication, when running with JVM-1.7.0-SR1 or above

Hello,

Please find attached a text file with the sample code for deleting a file on WebHDFS, with
Kerberos security enabled.



I see that this code works successfully when run with JDK-1.7-GA (i.e., "J9VM
- R27_Java727_GA_20131114_0833_B175264", as shown by the output of "java -version").
However, when the same code is run with JDK-1.7-SR1 or above (i.e., "J9VM - R27_Java727_SR1_20140410_1931_B195893",
as shown by the output of "java -version"), it fails with the exception shown in the attached
text file.



I also tried JDK-1.7-SR3.fp20 ("J9VM - R27_Java727_SR3_20151022_1530_B273253") and
got the same exception that I am seeing with JDK-1.7-SR1. I executed all these tests
using the same set of Hadoop/Hive jars, the only change being the JDK version.

Could you please take a look and provide your inputs/suggestions on how to proceed further
with this issue?

Thanks,
Ravi