hadoop-common-user mailing list archives

From kumar r <kumarc...@gmail.com>
Subject Re: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled
Date Mon, 24 Oct 2016 07:12:12 GMT
Hi,

If I install the JCE policy files, it shows:

GSSException: Failure unspecified at GSS-API level (Mechanism level:
Specified version of key is not available (44))


Without the policy files installed, it works fine against the local
Windows Active Directory.
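For reference, one way to check from inside the JVM whether the unlimited-strength policy is actually in effect is to ask JCE for the maximum permitted AES key length. This is a minimal JDK-only sketch (the class name `CheckJcePolicy` is my own; `Cipher.getMaxAllowedKeyLength` is the standard JCE call):

```java
import javax.crypto.Cipher;

public class CheckJcePolicy {
    public static void main(String[] args) throws Exception {
        // With the unlimited-strength policy, this is Integer.MAX_VALUE;
        // the old "limited" Oracle JDK policy caps AES at 128 bits,
        // which makes aes256-cts-hmac-sha1-96 unusable for Kerberos.
        int maxKeyLen = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max AES key length: " + maxKeyLen);
        if (maxKeyLen < 256) {
            System.out.println("AES-256 NOT available: install the JCE unlimited-strength policy files");
        } else {
            System.out.println("AES-256 available");
        }
    }
}
```

Running this on the NameNode host's JVM shows which policy that JVM is really using, independent of what was copied into jre/lib/security.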

Thanks,



On Mon, Oct 24, 2016 at 12:28 PM, <wget.null@gmail.com> wrote:

> Looks like the strong encryption policy file for Java (Oracle) isn’t
> installed. Or you don’t have a valid Kerberos ticket in your cache (klist).
>
>
>
> --
> B: mapredit.blogspot.com
>
>
>
> From: kumar r <kumarccpp@gmail.com>
> Sent: Monday, October 24, 2016 8:49 AM
> To: user@hadoop.apache.org
> Subject: Encryption type AES256 CTS mode with HMAC SHA1-96 is
> not supported/enabled
>
>
>
> Hi,
>
> I am trying to configure a secure Hadoop pseudo-distributed cluster (to
> verify that it works correctly) in Azure using Azure Domain Services.
>
> OS - Windows Server 2012 R2 Datacenter
>
> Hadoop Version - 2.7.2
>
> I am able to run
>
> hadoop fs -ls /
>
> The example MapReduce job also works fine:
>
> yarn jar %HADOOP_HOME%\share\hadoop\mapreduce\hadoop-mapreduce-examples-*.jar pi 16 10000
>
>
>
> But when I run
>
> hdfs fsck /
>
> it gives:
>
> Connecting to namenode via https://node1:50470/fsck?ugi=Kumar&path=%2F
> Exception in thread "main" java.io.IOException:
> org.apache.hadoop.security.authentication.client.AuthenticationException:
> Authentication failed, status: 403, message: GSSException: No valid
> credentials provided (Mechanism level: Failed to find any Kerberos
> credentails)
>         at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:335)
>         at org.apache.hadoop.hdfs.tools.DFSck.access$000(DFSck.java:73)
>         at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:152)
>         at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:149)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>         at org.apache.hadoop.hdfs.tools.DFSck.run(DFSck.java:148)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>         at org.apache.hadoop.hdfs.tools.DFSck.main(DFSck.java:377)
> Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException:
> Authentication failed, status: 403, message: GSSException: No valid
> credentials provided (Mechanism level: Failed to find any Kerberos
> credentails)
>         at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)
>         at org.apache.hadoop.security.authentication.client.PseudoAuthenticator.authenticate(PseudoAuthenticator.java:77)
>         at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:214)
>         at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
>         at org.apache.hadoop.hdfs.web.URLConnectionFactory.openConnection(URLConnectionFactory.java:161)
>         at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:333)
>         ... 10 more
>
> When I access the NameNode web UI, it shows:
>
> GSSException: Failure unspecified at GSS-API level (Mechanism level:
> Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled)
>
> Could someone help me resolve this error and get the cluster working?
>
>
>
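The enctype named in that last error also has to be permitted on the Kerberos client side, not just in the JVM's JCE policy. As a hedged illustration only (standard MIT krb5.conf [libdefaults] options; whether these lines are needed here depends on the KDC and the JVM defaults, and the thread does not confirm this was the cause):

```ini
[libdefaults]
    # Permit AES256-CTS-HMAC-SHA1-96 for initial and service tickets.
    # Only effective if the JVM's JCE policy also allows 256-bit AES.
    default_tkt_enctypes = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96
    default_tgs_enctypes = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96
    permitted_enctypes   = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96
```

After changing enctype settings, the service keytab generally needs to be regenerated so its keys (and key version numbers) match what the KDC issues.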
