From: Kumar Ravi <kumarr@us.ibm.com>
To: user@hadoop.apache.org
Date: Tue, 11 Dec 2012 08:56:24 -0600
Subject: Re: using hadoop on zLinux (Linux on S390)
In-Reply-To: <20121211145053.311300@gmx.net>

Hi Emile,

 I have a couple of questions for you:

1. What version and vendor of JDK did you use to compile and package Hadoop?

2. What version and vendor of JVM are you running? You can type java -version from the console to see this.
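A minimal sketch of the same check from inside Java (hypothetical class name `PrintJvmInfo`): the standard system properties java.vendor, java.version, and java.vm.name carry the information that `java -version` prints on the console.

```java
// PrintJvmInfo.java -- print the JVM details Kumar is asking about.
public class PrintJvmInfo {
    static String prop(String key) {
        // java.vendor, java.version, and java.vm.name are standard
        // properties defined by every conforming JVM.
        return System.getProperty(key);
    }

    public static void main(String[] args) {
        System.out.println("JDK vendor:  " + prop("java.vendor"));
        System.out.println("JDK version: " + prop("java.version"));
        // On an IBM JDK (the kind that ships the com.ibm.security.auth
        // classes) java.vm.name typically contains "IBM J9".
        System.out.println("VM name:     " + prop("java.vm.name"));
    }
}
```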

Thanks,
Kumar

Kumar Ravi
IBM Linux Technology Center



From: "Emile Kao" <emilekao@gmx.net>
To: user@hadoop.apache.org,
Date: 12/11/2012 08:51 AM
Subject: Re: using hadoop on zLinux (Linux on S390)




No, this is the generally available version...

-------- Original Message --------
> Date: Tue, 11 Dec 2012 08:31:57 -0600
> From: Michael Segel <michael_segel@hotmail.com>
> To: user@hadoop.apache.org
> Subject: Re: using hadoop on zLinux (Linux on S390)

> Well, on the surface....
>
> It looks like it's either a missing class, or you don't have your
> classpath set up right.
>
> I'm assuming you got this version of Hadoop from IBM, so I would suggest
> contacting their support and opening up a ticket.
>
>
> On Dec 11, 2012, at 8:23 AM, Emile Kao <emilekao@gmx.net> wrote:
>
> > Hello community,
> > I am trying to use Hadoop 1.1.0 on SLES 11 (zLinux) running on IBM
> S390.
> > The Java provided is "java-s390x-60", 64-bit.
> > While trying to format the namenode I got the following error:
> >
> > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > /************************************************************
> > STARTUP_MSG: Starting NameNode
> > STARTUP_MSG:   host = xxxxxxxxx
> > STARTUP_MSG:   args = [-format]
> > STARTUP_MSG:   version = 1.1.0
> > STARTUP_MSG:   build =
>
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49
> UTC 2012
> > ************************************************************/
> > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
> > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find
> JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure
> to login
> >        at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > Caused by: javax.security.auth.login.LoginException:
> java.lang.NullPointerException: invalid null Class provided
> >        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
> >        at
> org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
> >        at
> org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
> >        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:600)
> >        at
> javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
> >        at
> javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> >        at
> javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> >        at
> java.security.AccessController.doPrivileged(AccessController.java:284)
> >        at
> javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> >        at
> javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> >
> >        at
> javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
> >        at
> javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> >        at
> javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> >        at
> java.security.AccessController.doPrivileged(AccessController.java:284)
> >        at
> javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> >        at
> javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> >        ... 6 more
> >
> > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > /************************************************************
> > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > ************************************************************/
> > $:/opt/flume_hadoop/hadoop-1.1.0>
> >
> > Question:
> >
> > 1) @developer: Are you aware of this behavior?
> > 2) Is there a way to overcome this problem with a workaround?
> > 3) Is it a security issue? I was able to ssh to localhost
> without error.
> >
>
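The ERROR line in the log above says Hadoop could not find the JAAS principal class com.ibm.security.auth.LinuxPrincipal, which ships only with IBM JDKs. A hedged, standalone sketch (hypothetical class name `CheckJaasClass`) of the first thing worth verifying before suspecting Hadoop itself: whether that class is visible on the running JVM's classpath at all.

```java
// CheckJaasClass.java -- probe whether a class is loadable from the
// current classpath, using the same lookup mechanism (Class.forName)
// that fails inside UserGroupInformation.
public class CheckJaasClass {
    static boolean isPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cls = "com.ibm.security.auth.LinuxPrincipal";
        System.out.println(cls + (isPresent(cls)
                ? " is on the classpath"
                : " is NOT on the classpath"));
        // The effective classpath this JVM was started with:
        System.out.println("classpath: " + System.getProperty("java.class.path"));
    }
}
```

On a non-IBM JVM the probe prints "is NOT on the classpath", which would reproduce the condition the NameNode hit; on an IBM JDK it should succeed.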

