hadoop-mapreduce-user mailing list archives

From "K. N. Ramachandran" <knra...@gmail.com>
Subject Re: Hadoop Kerberos - Authentication issue IPC Server/Client
Date Mon, 21 Mar 2016 22:05:44 GMT
Hi,

I was able to narrow down the issue further. I had been setting up the
Kerberos principals incorrectly, and I have now fixed that.
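As a quick sanity check on principal names themselves, the plain JDK can parse a service-style principal without any KDC or krb5.conf, as long as the realm is given explicitly. This is only an illustrative sketch (the principal string mirrors the one in my logs), not a substitute for checking the KDC database:

```java
import javax.security.auth.kerberos.KerberosPrincipal;

public class PrincipalCheck {
    public static void main(String[] args) {
        // Illustrative principal of the user/host@REALM form; since the
        // realm is explicit, no krb5.conf default-realm lookup is needed.
        KerberosPrincipal p =
                new KerberosPrincipal("ram/ram-virtualbox@RAM-VIRTUALBOX");
        System.out.println("name:  " + p.getName());   // full principal string
        System.out.println("realm: " + p.getRealm());  // realm component only
    }
}
```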

Now both the server and the client have the same UGI and are authenticated
with Kerberos (hasKerberosCredentials() returns true). But on the server
side, I now get:

16/03/21 16:14:50 DEBUG security.UserGroupInformation:
PrivilegedActionException as:ram/ram-virtualbox@RAM-VIRTUALBOX
(auth:KERBEROS) cause:javax.security.sasl.SaslException: Failure to
initialize security context [Caused by GSSException: No valid credentials
provided (Mechanism level: Failed to find any Kerberos credentails)]


I have observed this error earlier when attempting to submit jobs with a
TGT. But I can confirm that TGTs are available for all relevant principals
(my ram/ram-vbox/RAM-VBOX principal, as well as the hdfs and yarn
principals).

I am not clear on what could still be causing this error. Can anyone
suggest where else I should look?
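One thing that has helped me so far is the JDK's own Kerberos/GSS tracing, which prints which credential cache or keytab the JVM actually consulted and why GSSContext initialization failed. These are standard JDK system properties; in a Hadoop deployment they would normally be passed via HADOOP_OPTS (e.g. -Dsun.security.krb5.debug=true), but a minimal sketch of enabling them in code looks like:

```java
public class KrbDebug {
    public static void main(String[] args) {
        // Standard JDK switches; must be set before the first GSS/Kerberos
        // call in the JVM for the tracing to take effect.
        System.setProperty("sun.security.krb5.debug", "true");
        System.setProperty("sun.security.jgss.debug", "true");
        System.out.println("krb5 debug: "
                + System.getProperty("sun.security.krb5.debug"));
        System.out.println("jgss debug: "
                + System.getProperty("sun.security.jgss.debug"));
    }
}
```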


On Fri, Mar 18, 2016 at 5:57 PM, K. N. Ramachandran <knram06@gmail.com>
wrote:

> Hi,
>
> I have a Kerberos setup with Hadoop (single node cluster) in an Ubuntu
> environment (VirtualBox setup).
>
> We are using a variant of a Yarn application and the Client.java in this
> variant opens a socket for communicating to the ApplicationMaster and
> receiving messages.
>
> Without Kerberos, this works fine. I am currently investigating whether
> the entire structure will work with Kerberos too and what code changes
> would be necessary. With Kerberos, the socket-connection step fails with
> the errors outlined in the attached file (kerbFailure.txt); a snippet:
> 16/03/18 17:18:28 WARN ipc.Client: Exception encountered while connecting
> to the server : org.apache.hadoop.security.AccessControlException: Client
> cannot authenticate via:[KERBEROS]
>
> Now I have enabled Kerberos authentication on the Hadoop cluster by
> following the instructions at:
>
> http://www.cloudera.com/documentation/archive/cdh/4-x/4-3-0/CDH4-Security-Guide/cdh4sg_topic_3.html
>
> Since the stacktrace has references to SASL connection methods, should I
> explicitly enable SASL authentication, following the instructions at:
>
> https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-common/SecureMode.html
> ?
>
> My impression was that SASL DataTransfer is optional (only needed if I
> want to start DataNodes as non-root), and I currently start the secure
> DataNode as root with JSVC_HOME set, using the scripts in the sbin folder.
>
> I can also verify that both client and server processes return the correct
> Kerberos principal when I do:
>
> UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
> LOG.info("UGI: " + ugi + ", hasKerb: " + ugi.hasKerberosCredentials());
> // outputs: UGI: ram@RAM-VIRTUALBOX (auth:KERBEROS), hasKerb: true
>
> I have hdfs and yarn as separate users. Both have their respective
> Kerberos principals and are authenticated through keytabs. My username is
> added as a principal too and is authenticated with a password. So system
> startup and Yarn job submission are fine, but I encounter the error at the
> socket-connection step as described above.
>
> Hope this overview helps. Please let me know if you might need more
> information.
>
> Thanking You,
> K.N.Ramachandran
>
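
For completeness, regarding the SASL question in the quoted message: the entries below sketch the kind of configuration involved, following the CDH and SecureMode guides linked above. The values are illustrative of my single-node setup, not a recommendation:

```xml
<!-- core-site.xml: baseline Kerberos settings (per the linked guides) -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>

<!-- hdfs-site.xml: only needed when switching to SASL data transfer
     instead of running the DataNode on privileged ports via root + JSVC -->
<property>
  <name>dfs.data.transfer.protection</name>
  <value>authentication</value>
</property>
```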


Thanking You,
K.N.Ramachandran
