From: kevin <kiss.kevin119@gmail.com>
Date: Tue, 20 Sep 2016 16:52:52 +0800
Subject: Re: hdfs2.7.3 kerberos can not startup
To: Rakesh Radhakrishnan
Cc: user@hadoop.apache.org

thanks, but my issue is that the name node could log in successfully, while the second namenode couldn't. And the name node got "HttpServer.start() threw a non Bind IOException":

hdfs-site.xml:

<property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
</property>
<property>
    <name>dfs.block.access.token.enable</name>
    <value>true</value>
</property>

<!-- NameNode security config -->
<property>
    <name>dfs.namenode.kerberos.principal</name>
    <value>hadoop/_HOST@EXAMPLE.COM</value>
</property>
<property>
    <name>dfs.namenode.keytab.file</name>
    <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
    <name>dfs.https.port</name>
    <value>50470</value>
</property>
<property>
    <name>dfs.namenode.https-address</name>
    <value>dmp1.example.com:50470</value>
</property>
<property>
    <name>dfs.namenode.kerberos.internal.spnego.principa</name>
    <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
    <name>dfs.web.authentication.kerberos.keytab</name>
    <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
    <name>dfs.http.policy</name>
    <value>HTTPS_ONLY</value>
</property>
<property>
    <name>dfs.https.enable</name>
    <value>true</value>
</property>

<!-- secondary NameNode security config -->
<property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>dmp1.example.com:50090</value>
</property>
<property>
    <name>dfs.secondary.namenode.keytab.file</name>
    <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
    <name>dfs.secondary.namenode.kerberos.principa</name>
    <value>hadoop/_HOST@EXAMPLE.COM</value>
</property>
<property>
    <name>dfs.secondary.namenode.kerberos.internal.spnego.principal</name>
    <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
    <name>dfs.namenode.secondary.https-port</name>
    <value>50470</value>
</property>

<!-- JournalNode security config -->
<property>
    <name>dfs.journalnode.keytab.file</name>
    <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
    <name>dfs.journalnode.kerberos.principa</name>
    <value>hadoop/_HOST@EXAMPLE.COM</value>
</property>
<property>
    <name>dfs.journalnode.kerberos.internal.spnego.principa</name>
    <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
    <name>dfs.web.authentication.kerberos.keytab</name>
    <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>

<!-- DataNode security config -->
<property>
    <name>dfs.datanode.kerberos.principal</name>
    <value>hadoop/_HOST@EXAMPLE.COM</value>
</property>
<property>
    <name>dfs.datanode.keytab.file</name>
    <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
    <name>dfs.datanode.data.dir.perm</name>
    <value>700</value>
</property>

<!-- datanode SASL -->
<property>
    <name>dfs.datanode.address</name>
    <value>0.0.0.0:61004</value>
</property>
<property>
    <name>dfs.datanode.http.address</name>
    <value>0.0.0.0:61006</value>
</property>
<property>
    <name>dfs.datanode.https.address</name>
    <value>0.0.0.0:50470</value>
</property>
<property>
    <name>dfs.data.transfer.protection</name>
    <value>integrity</value>
</property>
<property>
    <name>dfs.web.authentication.kerberos.principal</name>
    <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
    <name>dfs.web.authentication.kerberos.keytab</name>
    <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>

and [hadoop@dmp1 hadoop-2.7.3]$ klist -ket /etc/hadoop/conf/hdfs.keytab

Keytab name: FILE:/etc/hadoop/conf/hdfs.keytab
KVNO Timestamp           Principal
---- ------------------- ------------------------------------------------------
   2 09/19/2016 16:00:41 hdfs/dmp1.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp1.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp1.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 hdfs/dmp1.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 16:00:41 hdfs/dmp2.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp2.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp2.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 hdfs/dmp2.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 16:00:41 hdfs/dmp3.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp3.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp3.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 hdfs/dmp3.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 16:00:41 HTTP/dmp1.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp1.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp1.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 HTTP/dmp1.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 16:00:41 HTTP/dmp2.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp2.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp2.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 HTTP/dmp2.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 16:00:41 HTTP/dmp3.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp3.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp3.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 HTTP/dmp3.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 20:21:03 hadoop/dmp1.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp1.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp1.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 20:21:03 hadoop/dmp1.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 20:21:03 hadoop/dmp2.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp2.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp2.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 20:21:03 hadoop/dmp2.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 20:21:03 hadoop/dmp3.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp3.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp3.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 20:21:03 hadoop/dmp3.example.com@EXAMPLE.COM (arcfour-hmac)

2016-09-20 15:52 GMT+08:00 Rakesh Radhakrishnan <rakeshr@apache.org>:

> >>>>>>Caused by: javax.security.auth.login.LoginException: Unable to
> obtain password from user
>
> Could you please check that the kerberos principal name is specified
> correctly in "hdfs-site.xml"; it is used to authenticate against Kerberos.
>
> If the keytab file defined in "hdfs-site.xml" doesn't exist or the path
> is wrong, you will see this error. So, please verify that the path and
> the keytab filename are configured correctly.
>
> I hope the hadoop discussion thread https://goo.gl/M6l3vv may help you.
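[Editor's note: the suggestion above — confirming that the principal a daemon will log in with actually has an entry in the keytab — can be sketched as a quick check against `klist -ket` output. The helper names and the sample data below are illustrative, not part of Hadoop; when a resolved principal is missing from the keytab, the "Unable to obtain password from user" error is the typical symptom.]

```python
# Sketch: does the principal a daemon resolves from hdfs-site.xml
# actually appear in the keytab listing? (hypothetical helpers)
import re
import socket

def resolve_principal(principal, fqdn=None):
    """Replace Hadoop's _HOST placeholder with the host's FQDN."""
    fqdn = fqdn or socket.getfqdn()
    return principal.replace("_HOST", fqdn)

def principal_in_klist(klist_output, principal):
    """True if `principal` appears as an entry in `klist -ket` output."""
    entries = set(re.findall(r"\S+@\S+", klist_output))
    return principal in entries

# Two lines excerpted from the keytab listing in this thread:
sample = """\
   2 09/19/2016 20:21:03 hadoop/dmp1.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp1.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
"""

nn = resolve_principal("hadoop/_HOST@EXAMPLE.COM", fqdn="dmp1.example.com")
print(principal_in_klist(sample, nn))                    # True: entry exists
print(principal_in_klist(sample, "hadoop@EXAMPLE.COM"))  # False: bare "hadoop" has no entry
```

On a real node the `sample` text would come from `subprocess.run(["klist", "-ket", keytab], ...)` instead of a literal string.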
>
> >>>>>>>2016-09-20 00:54:06,665 INFO org.apache.hadoop.http.HttpServer2:
> HttpServer.start() threw a non Bind IOException
> java.io.IOException: !JsseListener: java.lang.NullPointerException
>
> This is probably due to some missing configuration.
> Could you please re-check the ssl-server.xml keystore and truststore
> properties:
>
> ssl.server.keystore.location
> ssl.server.keystore.keypassword
> ssl.client.truststore.location
> ssl.client.truststore.password
>
> Rakesh
>
> On Tue, Sep 20, 2016 at 10:53 AM, kevin <kiss.kevin119@gmail.com> wrote:
>
>> hi, all:
>> My environment: CentOS 7.2, hadoop 2.7.3, jdk 1.8.
>> After I configured hdfs with kerberos, I can't start it up with
>> sbin/start-dfs.sh.
>>
>> ::namenode log as below
>>
>> STARTUP_MSG:   build = Unknown -r Unknown; compiled by 'root' on 2016-09-18T09:05Z
>> STARTUP_MSG:   java = 1.8.0_102
>> ************************************************************/
>> 2016-09-20 00:54:05,822 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
>> 2016-09-20 00:54:05,825 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
>> 2016-09-20 00:54:06,078 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
>> 2016-09-20 00:54:06,149 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
>> 2016-09-20 00:54:06,149 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
>> 2016-09-20 00:54:06,151 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://dmp1.example.com:9000
>> 2016-09-20 00:54:06,152 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use dmp1.example.com:9000 to access this namenode/service.
>> 2016-09-20 00:54:06,446 INFO org.apache.hadoop.security.UserGroupInformation: Login successful for user hadoop/dmp1.example.com@EXAMPLE.COM using keytab file /etc/hadoop/conf/hdfs.keytab
>> 2016-09-20 00:54:06,472 INFO org.apache.hadoop.hdfs.DFSUtil: Starting web server as: HTTP/dmp1.example.com@EXAMPLE.COM
>> 2016-09-20 00:54:06,475 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: https://dmp1.example.com:50470
>> 2016-09-20 00:54:06,517 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
>> 2016-09-20 00:54:06,533 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
>> 2016-09-20 00:54:06,542 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
>> 2016-09-20 00:54:06,546 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>> 2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
>> 2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
>> 2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
>> 2016-09-20 00:54:06,653 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
>> 2016-09-20 00:54:06,654 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
>> 2016-09-20 00:54:06,657 INFO org.apache.hadoop.http.HttpServer2: Adding Kerberos (SPNEGO) filter to getDelegationToken
>> 2016-09-20 00:54:06,658 INFO org.apache.hadoop.http.HttpServer2: Adding Kerberos (SPNEGO) filter to renewDelegationToken
>> 2016-09-20 00:54:06,658 INFO org.apache.hadoop.http.HttpServer2: Adding Kerberos (SPNEGO) filter to cancelDelegationToken
>> 2016-09-20 00:54:06,659 INFO org.apache.hadoop.http.HttpServer2: Adding Kerberos (SPNEGO) filter to fsck
>> 2016-09-20 00:54:06,659 INFO org.apache.hadoop.http.HttpServer2: Adding Kerberos (SPNEGO) filter to imagetransfer
>> 2016-09-20 00:54:06,665 WARN org.mortbay.log: java.lang.NullPointerException
>> 2016-09-20 00:54:06,665 INFO org.apache.hadoop.http.HttpServer2: HttpServer.start() threw a non Bind IOException
>> java.io.IOException: !JsseListener: java.lang.NullPointerException
>> 	at org.mortbay.jetty.security.SslSocketConnector.newServerSocket(SslSocketConnector.java:516)
>> 	at org.apache.hadoop.security.ssl.SslSocketConnectorSecure.newServerSocket(SslSocketConnectorSecure.java:47)
>> 	at org.mortbay.jetty.bio.SocketConnector.open(SocketConnector.java:73)
>> 	at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:914)
>> 	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:856)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:753)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:639)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:812)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:796)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1493)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1559)
>>
>> ::second namenode log as below
>>
>> STARTUP_MSG:   build = Unknown -r Unknown; compiled by 'root' on 2016-09-18T09:05Z
>> STARTUP_MSG:   java = 1.8.0_102
>> ************************************************************/
>> 2016-09-20 00:54:14,885 INFO org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: registered UNIX signal handlers for [TERM, HUP, INT]
>> 2016-09-20 00:54:15,263 FATAL org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Failed to start secondary namenode
>> java.io.IOException: Login failure for hadoop from keytab /etc/hadoop/conf/hdfs.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
>> 	at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:963)
>> 	at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:246)
>> 	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:217)
>> 	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:192)
>> 	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:671)
>> Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
>> 	at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:897)
>> 	at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760)
>> 	at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> 	at java.lang.reflect.Method.invoke(Method.java:498)
>> 	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
>> 	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
>> 	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
>> 	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
>> 	at java.security.AccessController.doPrivileged(Native Method)
>> 	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
>> 	at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
>> 	at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:954)
>> 	... 4 more
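[Editor's note: the ssl-server.xml properties Rakesh lists can be sketched as the fragment below. The paths and passwords are placeholders, not values from this thread; the ssl.client.truststore.* names he mentions conventionally live in ssl-client.xml rather than ssl-server.xml.]

```xml
<configuration>
  <property>
    <name>ssl.server.keystore.location</name>
    <value>/etc/hadoop/conf/keystore.jks</value> <!-- placeholder path -->
  </property>
  <property>
    <name>ssl.server.keystore.password</name>
    <value>changeit</value> <!-- placeholder -->
  </property>
  <property>
    <name>ssl.server.keystore.keypassword</name>
    <value>changeit</value> <!-- placeholder -->
  </property>
  <property>
    <name>ssl.server.truststore.location</name>
    <value>/etc/hadoop/conf/truststore.jks</value> <!-- placeholder path -->
  </property>
  <property>
    <name>ssl.server.truststore.password</name>
    <value>changeit</value> <!-- placeholder -->
  </property>
</configuration>
```

If any of these point at a missing keystore, Jetty's SslSocketConnector can fail with a NullPointerException like the one in the namenode log above.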