hadoop-mapreduce-user mailing list archives

From John Lilley <john.lil...@redpoint.net>
Subject Issue with Hadoop/Kerberos security as client
Date Tue, 19 Aug 2014 23:36:08 GMT
We are encountering a really strange issue accessing Hadoop securely as a client.  We go through
the motions of setting the security configuration:

    // Point the client at the cluster's service principals and switch
    // Hadoop's security machinery to Kerberos before any UGI calls.
    YarnConfiguration conf = new YarnConfiguration();
    conf.set(DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY, nnPrincipal);
    conf.set(YarnConfiguration.RM_PRINCIPAL, rmPrincipal);
    conf.set(CommonConfigurationKeysPublic.HADOOP_SECURITY_AUTHENTICATION, "kerberos");
    conf.set(CommonConfigurationKeysPublic.HADOOP_SECURITY_AUTHORIZATION, "true");
    UserGroupInformation.setConfiguration(conf);
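
For reference, the two principals above are the cluster's Kerberos service principals; typical
values look like the following (illustrative only, and Hadoop expands the _HOST token to the
actual service host at connect time):

    // Hypothetical example values; the real ones come from the cluster's
    // hdfs-site.xml (dfs.namenode.kerberos.principal) and
    // yarn-site.xml (yarn.resourcemanager.principal).
    String nnPrincipal = "nn/_HOST@EXAMPLE.COM";
    String rmPrincipal = "rm/_HOST@EXAMPLE.COM";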

And then we log in a user with a password:

    // JAAS options for an interactive Kerberos login: authenticate with the
    // supplied password only; no ticket cache, no keytab, no TGT renewal.
    Map<String,String> krbOptions = new HashMap<String,String>();
    krbOptions.put("doNotPrompt", "false");
    krbOptions.put("useTicketCache", "false");
    krbOptions.put("useKeyTab", "false");
    krbOptions.put("renewTGT", "false");
    AppConfigurationEntry ace = new AppConfigurationEntry(
        KerberosUtil.getKrb5LoginModuleName(), LoginModuleControlFlag.REQUIRED,
        krbOptions);
    DynamicConfiguration dynConf = new DynamicConfiguration(
        new AppConfigurationEntry[] {ace});

    // newLoginContext, newUser, newUserGroupInformation, and setUGILogin are
    // our helpers that mirror UserGroupInformation's internal login sequence.
    LoginContext loginContext = newLoginContext(USER_PASSWORD_LOGIN_KERBEROS_CONFIG_NAME,
        null, new LoginHandler(principal, password), dynConf);
    loginContext.login();
    Subject loginSubject = loginContext.getSubject();
    Set<Principal> loginPrincipals = loginSubject.getPrincipals();
    if (loginPrincipals.isEmpty()) {
      throw new LoginException("No login principals in loginSubject: " + loginSubject);
    }
    // Wrap the Kerberos principal in a Hadoop User and install it as the
    // process-wide login user.
    String username = loginPrincipals.iterator().next().getName();
    Principal ugiUser = newUser(username, AuthenticationMethod.KERBEROS, loginContext);
    loginSubject.getPrincipals().add(ugiUser);
    UserGroupInformation loginUser = newUserGroupInformation(loginSubject);
    UserGroupInformation.setLoginUser(loginUser);
    setUGILogin(loginUser, loginContext); // i.e. loginUser.setLogin(loginContext)
    loginUser.setAuthenticationMethod(AuthenticationMethod.KERBEROS);
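
(Aside: on Hadoop 2.3 and later, we believe most of the reflection above can be replaced with
the public UserGroupInformation.loginUserFromSubject entry point; a minimal sketch, assuming
the same JAAS setup as above:

    // Public-API alternative (Hadoop 2.3+): after loginContext.login(), hand
    // the JAAS subject straight to UGI, which installs it as the process-wide
    // login user, much as our reflection-based code does.
    UserGroupInformation.loginUserFromSubject(loginContext.getSubject());
    UserGroupInformation loginUser = UserGroupInformation.getLoginUser();

Either route should produce the same login user.)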


This all works fine.  If the login is performed on the same thread that later calls the HDFS
and YARN APIs, everything is great: we can read/write HDFS, create and launch an
ApplicationMaster, and so on.  In that single-thread form it even works from C++ code calling
through JNI.

The problem occurs when the initialization happens in a different thread than the later HDFS/YARN
access.
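
In simplified form, the call pattern looks roughly like this (a pure-Java sketch;
TwoThreadRepro, initializeSecurity, and the file path are illustrative stand-ins for our code,
and as noted below this pure-Java form actually works; in our product both steps run on threads
attached from C++ via JNI):

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class TwoThreadRepro {

      // Stand-in for the configuration + LoginContext code shown above.
      private static void initializeSecurity() throws Exception { /* ... */ }

      public static void main(String[] args) throws Exception {
        initializeSecurity();  // runs on the first thread

        // In our product this second thread is a C++ thread attached via
        // AttachCurrentThread, not one started from Java.
        Thread worker = new Thread(new Runnable() {
          public void run() {
            try {
              FileSystem fs = FileSystem.get(
                  URI.create("hdfs://rpb-cds-cent6-01.office.datalever.com:8020"),
                  new Configuration());
              // First RPC to the NameNode; this is where the exception is thrown.
              FSDataOutputStream out = fs.create(new Path("/tmp/repro.txt"));
              out.close();
            } catch (Exception e) {
              e.printStackTrace();
            }
          }
        });
        worker.start();
        worker.join();
      }
    }

The first attempt to create an HDFS file system fails with the following stack trace: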

Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException:
Couldn't set up IO streams; Host Details : local host is: "soad/192.168.57.232"; destination
host is: "rpb-cds-cent6-01.office.datalever.com":8020;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
        at org.apache.hadoop.ipc.Client.call(Client.java:1351)
        at org.apache.hadoop.ipc.Client.call(Client.java:1300)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
        at com.sun.proxy.$Proxy9.create(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy9.create(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:227)
        at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1389)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1382)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1307)
        at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:384)
        at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:380)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:380)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:324)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:905)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
Caused by: java.io.IOException: Couldn't set up IO streams
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:711)
        at org.apache.hadoop.ipc.Client$Connection.access$2600(Client.java:314)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1399)
        at org.apache.hadoop.ipc.Client.call(Client.java:1318)
        ... 21 more
Caused by: java.util.ServiceConfigurationError: org.apache.hadoop.security.SecurityInfo: Provider
org.apache.hadoop.security.AnnotatedSecurityInfo not found
        at java.util.ServiceLoader.fail(Unknown Source)
        at java.util.ServiceLoader.access$300(Unknown Source)
        at java.util.ServiceLoader$LazyIterator.next(Unknown Source)
        at java.util.ServiceLoader$1.next(Unknown Source)
        at org.apache.hadoop.security.SecurityUtil.getTokenInfo(SecurityUtil.java:371)
        at org.apache.hadoop.security.SaslRpcClient.getServerToken(SaslRpcClient.java:260)
        at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:216)
        at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:157)
        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:387)
        at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:494)
        at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:314)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:659)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:655)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
        ... 24 more


Oddly, the multi-threaded access pattern works in pure Java; it only fails when performed from
C++ via JNI.  We are very careful to maintain global JNI references and so on, and the JNI
interface works flawlessly in all other cases.
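
One theory we are chasing, based on the ServiceConfigurationError at the bottom of the trace:
java.util.ServiceLoader resolves providers through the calling thread's context class loader.
A thread started from Java inherits its creator's context class loader, which would explain
why the pure-Java pattern works; a thread attached from native code via AttachCurrentThread
inherits nothing, and if the Hadoop classes are not visible from whatever loader it ends up
with, the lookup of org.apache.hadoop.security.AnnotatedSecurityInfo would fail exactly as
shown.  A minimal sketch of the workaround we intend to try (unverified; it would run on the
JNI-attached thread before any Hadoop call):

    // Unverified workaround: give the attached thread the class loader that
    // loaded the Hadoop classes, so ServiceLoader can locate the
    // AnnotatedSecurityInfo provider.
    Thread.currentThread().setContextClassLoader(
        UserGroupInformation.class.getClassLoader());

Any ideas?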

Thanks
John

