hadoop-yarn-issues mailing list archives

From "Liu, David" <liujion...@gmail.com>
Subject "SIMPLE authentication is not enabled" error for secured hdfs read
Date Tue, 24 Jun 2014 13:29:48 GMT
Hi experts,

After running kinit as the hadoop user, when I run the following Java program on a secured Hadoop cluster, I get this
error:
14/06/24 16:53:41 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs
(auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate
via:[TOKEN, KERBEROS]
14/06/24 16:53:41 WARN ipc.Client: Exception encountered while connecting to the server :
org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN,
KERBEROS]
14/06/24 16:53:41 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs
(auth:SIMPLE) cause:java.io.IOException: org.apache.hadoop.security.AccessControlException:
Client cannot authenticate via:[TOKEN, KERBEROS]
14/06/24 16:53:41 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs
(auth:SIMPLE) cause:java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException:
Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "hdsh2-a161/10.62.66.161";
destination host is: "hdsh2-a161.lss.emc.com":8020; 
Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException:
org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN,
KERBEROS]; Host Details : local host is: "hdsh2-a161/10.62.66.161"; destination host is: "hdsh2-a161.lss.emc.com":8020;

	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.Client.call(Client.java:1300)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
	at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:191)
	at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1067)
	at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1057)
	at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1047)
	at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:235)
	at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:202)
	at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:195)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1215)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:290)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:286)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:286)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:763)
	at Testhdfs$1.run(Testhdfs.java:43)
	at Testhdfs$1.run(Testhdfs.java:30)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
	at Testhdfs.main(Testhdfs.java:30)
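Looking at the log, the client is acting as hdfs with auth:SIMPLE, while the NameNode only accepts TOKEN or KERBEROS. To check which authentication method each UGI reports, I put together a small standalone diagnostic (a sketch only; UgiDebug is just an illustrative name and not part of my program):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class UgiDebug {
    public static void main(String[] args) throws IOException {
        // Pick up core-site.xml, where hadoop.security.authentication=kerberos.
        Configuration conf = new Configuration();
        UserGroupInformation.setConfiguration(conf);

        // The login user: after kinit this should report KERBEROS.
        UserGroupInformation login = UserGroupInformation.getLoginUser();
        System.out.println(login + " -> " + login.getAuthenticationMethod());

        // A UGI created only from a user name carries no credentials: this reports SIMPLE.
        UserGroupInformation remote = UserGroupInformation.createRemoteUser("hadoop");
        System.out.println(remote + " -> " + remote.getAuthenticationMethod());
    }
}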


Here is my code:

// Read the file as a remote user named "hadoop" inside doAs().
UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hadoop");
ugi.doAs(new PrivilegedExceptionAction<Void>() {
    public Void run() throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        FSDataInputStream in = fs.open(new Path(uri));
        IOUtils.copy(in, System.out, 4096);
        return null;
    }
});
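From the UserGroupInformation javadoc my understanding is that createRemoteUser() builds a UGI with no Kerberos credentials, which would explain the auth:SIMPLE in the log above. Here is a minimal sketch of the keytab-based login I believe is intended on a secured cluster; the principal and keytab path are placeholders I made up, not values from my cluster:

// Sketch only: log in from a keytab instead of createRemoteUser().
// "hadoop@EXAMPLE.COM" and "/etc/security/keytabs/hadoop.keytab" are placeholders.
final Configuration conf = new Configuration();
UserGroupInformation.setConfiguration(conf);
UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
        "hadoop@EXAMPLE.COM", "/etc/security/keytabs/hadoop.keytab");
ugi.doAs(new PrivilegedExceptionAction<Void>() {
    public Void run() throws Exception {
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        FSDataInputStream in = fs.open(new Path(uri));
        IOUtils.copy(in, System.out, 4096);
        return null;
    }
});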

But when I run it without UserGroupInformation, like this, on the same cluster and as the same
user, the code works fine:
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(URI.create(uri), conf);
FSDataInputStream in = fs.open(new Path(uri));
IOUtils.copy(in, System.out, 4096);
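If I understand the UGI code correctly, when there is no explicit doAs(), FileSystem.get() runs as UserGroupInformation.getCurrentUser(), which after kinit is the Kerberos-authenticated login user; I suspect that is why this version succeeds while the createRemoteUser() version is rejected. Please correct me if that reading is wrong.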

Could anyone help me?

Thanks