Subject: Re: hdfs2.7.3 kerberos can not startup
From: Wei-Chiu Chuang
Date: Tue, 20 Sep 2016 19:14:23 -0700
To: kevin
Cc: Brahma Reddy Battula, Rakesh Radhakrishnan, user@hadoop.apache.org

You need to run the kinit command to authenticate before running the hdfs dfs -ls command.
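For example (using the principal and keytab path that appear later in this thread; adjust for your own realm and host):

$ kinit -kt /etc/hadoop/conf/hdfs.keytab hadoop/dmp1.example.com@EXAMPLE.COM
$ klist            # confirm a TGT was obtained
$ hdfs dfs -ls /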

Wei-Chiu Chuang

On Sep 20, 2016, at 6:59 PM, kevin <kiss.kevin119@gmail.com> wrote:

Thank you Brahma Reddy Battula.
It was caused by a problem in my hdfs-site config file and the HTTPS CA configuration.
Now I can start up the namenode and I can see the datanodes from the web UI.
But when I try hdfs dfs -ls /:

[hadoop@dmp1 hadoop-2.7.3]$ hdfs dfs -ls /
16/09/20 07:56:48 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "dmp1.example.com/192.168.249.129"; destination host is: "dmp1.example.com":9000;

The current user is hadoop, which starts up HDFS, and I have added the hadoop principal with the command:
kadmin.local -q "addprinc hadoop" 
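For reference, the host-specific entries that appear in the keytab later in this thread are usually created and exported along these lines (a sketch only; realm, hostname and keytab path follow this thread):

$ kadmin.local -q "addprinc -randkey hadoop/dmp1.example.com@EXAMPLE.COM"
$ kadmin.local -q "ktadd -k /etc/hadoop/conf/hdfs.keytab hadoop/dmp1.example.com@EXAMPLE.COM"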


2016-09-20 17:33 GMT+08:00 Brahma Reddy Battula <brahmareddy.battula@huawei.com>:

Seems to be a property problem; it should be "principal" (the "l" is missing).

 

<property>
  <name>dfs.secondary.namenode.kerberos.principa</name>
  <value>hadoop/_HOST@EXAMPLE.COM</value>
</property>
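A quick way to spot any other truncated names (assuming the config file sits at /etc/hadoop/conf/hdfs-site.xml, as the keytab path in this thread suggests) is:

$ grep -n 'principa<' /etc/hadoop/conf/hdfs-site.xml    # matches only names cut off before the final "l"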

 

 

For the namenode httpserver start failure, please check Rakesh's comments:

 

This is probably due to some missing configuration.
Could you please re-check the ssl-server.xml, keystore and truststore properties:

 

ssl.server.keystore.location

ssl.server.keystore.keypassword

ssl.client.truststore.location

ssl.client.truststore.password

 

 

--Brahma Reddy Battula

 

From: kevin [mailto:kiss.kevin119@gmail.com]
Sent: 20 September 2016 16:53
To: Rakesh Radhakrishnan
Cc: user.hadoop
Subject: Re: hdfs2.7.3 kerberos can not startup

 

Thanks, but my issue is that the name node could log in successfully while the secondary namenode couldn't, and the name node's HttpServer.start() threw a non Bind IOException:

 

hdfs-site.xml:

 

<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
</property>

<property>
  <name>dfs.block.access.token.enable</name>
  <value>true</value>
</property>

<!-- NameNode security config -->
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hadoop/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.namenode.keytab.file</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
  <name>dfs.https.port</name>
  <value>50470</value>
</property>
<property>
  <name>dfs.namenode.https-address</name>
  <value>dmp1.example.com:50470</value>
</property>
<property>
  <name>dfs.namenode.kerberos.internal.spnego.principa</name>
  <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.web.authentication.kerberos.keytab</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
<property>
  <name>dfs.https.enable</name>
  <value>true</value>
</property>

<!-- secondary NameNode security config -->
<property>
  <name>dfs.namenode.secondary.http-address</name>
  <value>dmp1.example.com:50090</value>
</property>
<property>
  <name>dfs.secondary.namenode.keytab.file</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
  <name>dfs.secondary.namenode.kerberos.principa</name>
  <value>hadoop/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.secondary.namenode.kerberos.internal.spnego.principal</name>
  <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.namenode.secondary.https-port</name>
  <value>50470</value>
</property>

<!-- JournalNode security config -->
<property>
  <name>dfs.journalnode.keytab.file</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
  <name>dfs.journalnode.kerberos.principa</name>
  <value>hadoop/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.journalnode.kerberos.internal.spnego.principa</name>
  <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.web.authentication.kerberos.keytab</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>

<!-- DataNode security config -->
<property>
  <name>dfs.datanode.kerberos.principal</name>
  <value>hadoop/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.datanode.keytab.file</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
  <name>dfs.datanode.data.dir.perm</name>
  <value>700</value>
</property>

<!-- datanode SASL -->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:61004</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:61006</value>
</property>
<property>
  <name>dfs.datanode.https.address</name>
  <value>0.0.0.0:50470</value>
</property>

<property>
  <name>dfs.data.transfer.protection</name>
  <value>integrity</value>
</property>

<property>
  <name>dfs.web.authentication.kerberos.principal</name>
  <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.web.authentication.kerberos.keytab</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
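One way to see which of these values a daemon actually resolves from this file (a sketch; run it on the node whose configuration is in question) is hdfs getconf. Querying the correctly spelled keys should echo the intended values; a misspelled key such as dfs.secondary.namenode.kerberos.principa will not be picked up under its real name:

$ hdfs getconf -confKey dfs.namenode.kerberos.principal
$ hdfs getconf -confKey dfs.secondary.namenode.kerberos.principal   # comes back unset if only the "...principa" key exists
$ hdfs getconf -confKey dfs.http.policy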

 

and [hadoop@dmp1 hadoop-2.7.3]$ klist -ket /etc/hadoop/conf/hdfs.keytab

 

 

Keytab name: FILE:/etc/hadoop/conf/hdfs.keytab
KVNO Timestamp           Principal
---- ------------------- ------------------------------------------------------
   2 09/19/2016 16:00:41 hdfs/dmp1.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp1.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp1.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 hdfs/dmp1.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 16:00:41 hdfs/dmp2.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp2.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp2.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 hdfs/dmp2.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 16:00:41 hdfs/dmp3.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp3.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp3.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 hdfs/dmp3.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 16:00:41 HTTP/dmp1.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp1.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp1.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 HTTP/dmp1.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 16:00:41 HTTP/dmp2.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp2.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp2.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 HTTP/dmp2.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 16:00:41 HTTP/dmp3.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp3.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp3.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 16:00:41 HTTP/dmp3.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 20:21:03 hadoop/dmp1.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp1.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp1.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 20:21:03 hadoop/dmp1.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 20:21:03 hadoop/dmp2.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp2.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp2.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 20:21:03 hadoop/dmp2.example.com@EXAMPLE.COM (arcfour-hmac)
   2 09/19/2016 20:21:03 hadoop/dmp3.example.com@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp3.example.com@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp3.example.com@EXAMPLE.COM (des3-cbc-sha1)
   2 09/19/2016 20:21:03 hadoop/dmp3.example.com@EXAMPLE.COM (arcfour-hmac)

 

2016-09-20 15:52 GMT+08:00 Rakesh Radhakrishnan <rakeshr@apache.org>:

>>>>>>Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user

 

Could you please check that the Kerberos principal name is specified correctly in "hdfs-site.xml", which is used to authenticate against Kerberos.

If the keytab file defined in "hdfs-site.xml" doesn't exist or the path is wrong, you will see this error. So please verify that the path and the keytab filename are configured correctly.
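A couple of quick checks along those lines (using the paths from this thread) are:

$ ls -l /etc/hadoop/conf/hdfs.keytab        # the path exists and is readable by the user starting the daemon
$ klist -ket /etc/hadoop/conf/hdfs.keytab   # the keytab really contains the principal named in hdfs-site.xml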

 

I hope the Hadoop discussion thread https://goo.gl/M6l3vv may help you.

 

 

>>>>>>>2016-09-20 00:54:06,665 INFO org.apache.hadoop.http.HttpServer2: HttpServer.start() threw a non Bind IOException
java.io.IOException: !JsseListener: java.lang.NullPointerException

 

This is probably due to some missing configuration.
Could you please re-check the ssl-server.xml, keystore and truststore properties:

 

ssl.server.keystore.location

ssl.server.keystore.keypassword

ssl.client.truststore.location

ssl.client.truststore.password
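If those properties are set, it is also worth confirming that the referenced keystore and truststore actually open with the configured passwords, for example (hypothetical paths; use the locations from your own ssl-server.xml):

$ keytool -list -keystore /etc/hadoop/conf/keystore.jks
$ keytool -list -keystore /etc/hadoop/conf/truststore.jks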

 

Rakesh

 

On Tue, Sep 20, 2016 at 10:53 AM, kevin <kiss.kevin119@gmail.com> wrote:

Hi all:

My environment: CentOS 7.2, Hadoop 2.7.3, JDK 1.8

After I configured HDFS with Kerberos, I can't start it up with sbin/start-dfs.sh.

 

::namenode log as below

 

STARTUP_MSG:   build = Unknown -r Unknown; compiled by 'root' on 2016-09-18T09:05Z
STARTUP_MSG:   java = 1.8.0_102
************************************************************/

2016-09-20 00:54:05,822 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
2016-09-20 00:54:05,825 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
2016-09-20 00:54:06,078 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2016-09-20 00:54:06,149 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2016-09-20 00:54:06,149 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
2016-09-20 00:54:06,151 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://dmp1.example.com:9000
2016-09-20 00:54:06,152 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use dmp1.example.com:9000 to access this namenode/service.
2016-09-20 00:54:06,446 INFO org.apache.hadoop.security.UserGroupInformation: Login successful for user hadoop/dmp1.example.com@EXAMPLE.COM using keytab file /etc/hadoop/conf/hdfs.keytab
2016-09-20 00:54:06,472 INFO org.apache.hadoop.hdfs.DFSUtil: Starting web server as: HTTP/dmp1.example.com@EXAMPLE.COM
2016-09-20 00:54:06,475 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: https://dmp1.example.com:50470
2016-09-20 00:54:06,517 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2016-09-20 00:54:06,533 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2016-09-20 00:54:06,542 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
2016-09-20 00:54:06,546 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2016-09-20 00:54:06,653 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
2016-09-20 00:54:06,654 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
2016-09-20 00:54:06,657 INFO org.apache.hadoop.http.HttpServer2: Adding Kerberos (SPNEGO) filter to getDelegationToken
2016-09-20 00:54:06,658 INFO org.apache.hadoop.http.HttpServer2: Adding Kerberos (SPNEGO) filter to renewDelegationToken
2016-09-20 00:54:06,658 INFO org.apache.hadoop.http.HttpServer2: Adding Kerberos (SPNEGO) filter to cancelDelegationToken
2016-09-20 00:54:06,659 INFO org.apache.hadoop.http.HttpServer2: Adding Kerberos (SPNEGO) filter to fsck
2016-09-20 00:54:06,659 INFO org.apache.hadoop.http.HttpServer2: Adding Kerberos (SPNEGO) filter to imagetransfer
2016-09-20 00:54:06,665 WARN org.mortbay.log: java.lang.NullPointerException
2016-09-20 00:54:06,665 INFO org.apache.hadoop.http.HttpServer2: HttpServer.start() threw a non Bind IOException
java.io.IOException: !JsseListener: java.lang.NullPointerException

    at org.mortbay.jetty.security.SslSocketConnector.newServerSocket(SslSocketConnector.java:516)
    at org.apache.hadoop.security.ssl.SslSocketConnectorSecure.newServerSocket(SslSocketConnectorSecure.java:47)
    at org.mortbay.jetty.bio.SocketConnector.open(SocketConnector.java:73)
    at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:914)
    at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:856)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:753)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:639)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:812)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:796)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1493)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1559)

 

 

::second namenode log as below

 

STARTUP_MSG:   build = Unknown -r Unknown; compiled by 'root' on 2016-09-18T09:05Z
STARTUP_MSG:   java = 1.8.0_102
************************************************************/
2016-09-20 00:54:14,885 INFO org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: registered UNIX signal handlers for [TERM, HUP, INT]
2016-09-20 00:54:15,263 FATAL org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Failed to start secondary namenode
java.io.IOException: Login failure for hadoop from keytab /etc/hadoop/conf/hdfs.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user

 

    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:963)
    at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:246)
    at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:217)
    at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:192)
    at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:671)
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
    at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:897)
    at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760)
    at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
    at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
    at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
    at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
    at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:954)
    ... 4 more

 

 


