falcon-dev mailing list archives

From Arpit Gupta <ar...@hortonworks.com>
Subject Re: Replication Job throws GSSException
Date Thu, 10 Jul 2014 21:16:22 GMT
You need to provide the NameNode (nn) principal in the cluster entity XML for each cluster. The following property needs to be set in each cluster's XML:

dfs.namenode.kerberos.principal
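A minimal sketch of how this can be added under the cluster entity's properties section; the cluster name, colo, and the nn/_HOST@EXAMPLE.COM principal below are placeholders and should be replaced with your cluster's actual NameNode principal:

  <cluster colo="your-colo" description="" name="target-cluster" xmlns="uri:falcon:cluster:0.1">
    <!-- interfaces, locations, etc. -->
    <properties>
      <!-- Kerberos principal of this cluster's NameNode -->
      <property name="dfs.namenode.kerberos.principal" value="nn/_HOST@EXAMPLE.COM"/>
    </properties>
  </cluster>

Both the source and target cluster entities need this entry, since the replication job running on the target Oozie has to obtain delegation tokens from both NameNodes.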
--
Arpit Gupta
Hortonworks Inc.
http://hortonworks.com/

On Jul 10, 2014, at 2:08 PM, Venkat R <veramacha@yahoo.com> wrote:

> Using the demo example, there is a replication job that copies a dataset from the source to the target cluster by launching a REPLICATION job on the target cluster's Oozie. But it fails with the following GSSException:
> 
> I have added both Oozie servers (one each for the source and target clusters) to the core-site.xml of both clusters as proxyuser hosts, as shown below:
> 
> source-cluster and target-cluster: core-site.xml has the following:
> 
>   <property>
>     <name>hadoop.proxyuser.oozie.groups</name>
>     <value>users</value>
>   </property>
>   <property>
>     <name>hadoop.proxyuser.oozie.hosts</name>
>     <value>eat1-hcl0758.grid.linkedin.com,eat1-hcl0759.grid.linkedin.com</value>
>   </property>
> 
> Appreciate any pointers.
> Venkat
> 
> Failing Oozie Launcher, Main class [org.apache.falcon.latedata.LateDataHandler], main() threw exception, Unable to obtain remote token
> java.io.IOException: Unable to obtain remote token
> 	at org.apache.hadoop.hdfs.tools.DelegationTokenFetcher.getDTfromRemote(DelegationTokenFetcher.java:249)
> 	at org.apache.hadoop.hdfs.web.HftpFileSystem$2.run(HftpFileSystem.java:251)
> 	at org.apache.hadoop.hdfs.web.HftpFileSystem$2.run(HftpFileSystem.java:246)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hdfs.web.HftpFileSystem.getDelegationToken(HftpFileSystem.java:246)
> 	at org.apache.hadoop.hdfs.web.TokenAspect.ensureTokenInitialized(TokenAspect.java:143)
> 	at org.apache.hadoop.hdfs.web.HftpFileSystem.addDelegationTokenParam(HftpFileSystem.java:336)
> 	at org.apache.hadoop.hdfs.web.HftpFileSystem.openConnection(HftpFileSystem.java:323)
> 	at org.apache.hadoop.hdfs.web.HftpFileSystem$LsParser.fetchList(HftpFileSystem.java:455)
> 	at org.apache.hadoop.hdfs.web.HftpFileSystem$LsParser.getFileStatus(HftpFileSystem.java:470)
> 	at org.apache.hadoop.hdfs.web.HftpFileSystem.getFileStatus(HftpFileSystem.java:499)
> 	at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
> 	at org.apache.hadoop.fs.Globber.glob(Globber.java:238)
> 	at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1624)
> 	at org.apache.falcon.latedata.LateDataHandler.usage(LateDataHandler.java:269)
> 	at org.apache.falcon.latedata.LateDataHandler.getFileSystemUsageMetric(LateDataHandler.java:252)
> 	at org.apache.falcon.latedata.LateDataHandler.computeStorageMetric(LateDataHandler.java:224)
> 	at org.apache.falcon.latedata.LateDataHandler.computeMetrics(LateDataHandler.java:170)
> 	at org.apache.falcon.latedata.LateDataHandler.run(LateDataHandler.java:147)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> 	at org.apache.falcon.latedata.LateDataHandler.main(LateDataHandler.java:60)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
> 	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> 	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
> 	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
> 	at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:306)
> 	at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:196)
> 	at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:232)
> 	at org.apache.hadoop.hdfs.web.URLConnectionFactory.openConnection(URLConnectionFactory.java:164)
> 	at org.apache.hadoop.hdfs.tools.DelegationTokenFetcher.run(DelegationTokenFetcher.java:371)
> 	at org.apache.hadoop.hdfs.tools.DelegationTokenFetcher.getDTfromRemote(DelegationTokenFetcher.java:238)
> 	... 35 more
> Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
> 	at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
> 	at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
> 	at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
> 	at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
> 	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
> 	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> 	at org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:285)
> 	at org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:261)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:261)
> 	... 40 more
> 
> Oozie Launcher failed, finishing Hadoop job gracefully
> 


