hadoop-user mailing list archives

From Kumar Ravi <kum...@us.ibm.com>
Subject Re: using hadoop on zLinux (Linux on S390)
Date Tue, 11 Dec 2012 16:04:25 GMT
Emile,

 You need an s390 build of the hadoop-core binary. Since s390 is not a 
supported platform yet, you'll need to build it yourself.
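
For what it's worth, the Hadoop 1.x branch builds with Ant. A rough sketch of the steps, printed rather than executed here; the archive URL layout and the `bin-package` target name are my assumptions, so check build.xml in the unpacked source release:

```shell
#!/bin/sh
# Sketch: build hadoop-core 1.1.0 from source on zLinux (s390x).
# The download URL and Ant target names are assumptions -- verify
# against build.xml in the source release before running.
HADOOP_VERSION=1.1.0
SRC="hadoop-$HADOOP_VERSION"

echo "# 1. Fetch and unpack the source release:"
echo "wget https://archive.apache.org/dist/hadoop/common/hadoop-$HADOOP_VERSION/$SRC.tar.gz"
echo "tar xzf $SRC.tar.gz"

echo "# 2. Build with the IBM JDK on the s390x box:"
echo "cd $SRC && ant compile bin-package"

echo "# 3. Replace the jar in your existing install with the built one:"
echo "cp $SRC/build/hadoop-core-$HADOOP_VERSION.jar /opt/flume_hadoop/hadoop-$HADOOP_VERSION/"
```

Make sure JAVA_HOME points at the IBM JDK when you run the build, so the resulting jar is compiled against the JVM you will run it on.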

Hope this helps.

Regards,
Kumar

Kumar Ravi
IBM Linux Technology Center 
IBM Master Inventor

11501 Burnet Road,
Austin, TX 78758

Tel.: (512)286-8179



From: "Emile Kao" <emilekao@gmx.net>
To: user@hadoop.apache.org
Date: 12/11/2012 09:30 AM
Subject: Re: using hadoop on zLinux (Linux on S390)


Hello Kumar,
here are the answers to your questions:

> 1. What version and vendor of JDK did you use to compile and package hadoop? 

Answer:
I didn't compile the package; I followed the instructions in the official 
documentation (http://hadoop.apache.org/docs/r1.1.0/single_node_setup.html), 
and there was no mention of compiling the code first.
By the way, I am using the binary version I downloaded from the official 
download site, so I assume it is already compiled.

> 2. What version and vendor of JVM are you running? You can type java -version from the console to see this.

Answer:
This is the java version I am using:

java version "1.6.0"
Java(TM) SE Runtime Environment (build pxz6460sr10fp1-20120321_01(SR10 FP1))
IBM J9 VM (build 2.4, JRE 1.6.0 IBM J9 2.4 Linux s390x-64 jvmxz6460sr10fp1-20120202_101568 (JIT enabled, AOT enabled)
J9VM - 20120202_101568
JIT  - r9_20111107_21307ifx1
GC   - 20120202_AA)
JCL  - 20120320_01

Thank you in advance.

Cheers, Emile


-------- Original Message --------
> Date: Tue, 11 Dec 2012 08:56:24 -0600
> From: Kumar Ravi <kumarr@us.ibm.com>
> To: user@hadoop.apache.org
> Subject: Re: using hadoop on zLinux (Linux on S390)

> Hi Emile,
> 
>  I have a couple of questions for you:
> 
> 1. What version and vendor of JDK did you use to compile and package 
> hadoop? 
> 
> 2. What version and vendor of JVM are you running? You can type java 
> -version from the console to see this.
> 
> Thanks,
> Kumar
> 
> Kumar Ravi
> IBM Linux Technology Center 
> 
> 
> 
> 
> From: "Emile Kao" <emilekao@gmx.net>
> To: user@hadoop.apache.org
> Date: 12/11/2012 08:51 AM
> Subject: Re: using hadoop on zLinux (Linux on S390)
> 
> 
> No, this is the general available version...
> 
> -------- Original Message --------
> > Date: Tue, 11 Dec 2012 08:31:57 -0600
> > From: Michael Segel <michael_segel@hotmail.com>
> > To: user@hadoop.apache.org
> > Subject: Re: using hadoop on zLinux (Linux on S390)
> 
> > Well, on the surface.... 
> > 
> > It looks like it's either a missing class, or you don't have your
> > classpath set up right. 
> > 
> > I'm assuming you got this version of Hadoop from IBM, so I would
> > suggest contacting their support and opening up a ticket. 
> > 
> > 
> > On Dec 11, 2012, at 8:23 AM, Emile Kao <emilekao@gmx.net> wrote:
> > 
> > > Hello community,
> > > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
> > S390.
> > > The java provided is "java-s390x-60" 64Bit.
> > > While trying to format the namenode I got the following error:
> > > 
> > > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > > /************************************************************
> > > STARTUP_MSG: Starting NameNode
> > > STARTUP_MSG:   host = xxxxxxxxx
> > > STARTUP_MSG:   args = [-format]
> > > STARTUP_MSG:   version = 1.1.0
> > > STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49 UTC 2012
> > > ************************************************************/
> > > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
> > > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure to login
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> > >        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > Caused by: javax.security.auth.login.LoginException: java.lang.NullPointerException: invalid null Class provided
> > >        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
> > >        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
> > >        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
> > >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >        at java.lang.reflect.Method.invoke(Method.java:600)
> > >        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
> > >        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > 
> > >        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
> > >        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        ... 6 more
> > > 
> > > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > > /************************************************************
> > > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > > ************************************************************/
> > > $:/opt/flume_hadoop/hadoop-1.1.0>
> > > 
> > > Question:
> > > 
> > > 1) @developer: Are you aware of this behavior?
> > > 2) Is there a way to overcome this problem with a workaround?
> > > 3) Is it a security issue? --> I was able to ssh to localhost without error.
> > > 
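
The `Caused by: java.lang.NullPointerException: invalid null Class provided` line in the trace above can be reproduced in isolation. My reading (an assumption, not confirmed Hadoop source): HadoopLoginModule failed to load com.ibm.security.auth.LinuxPrincipal (the earlier "Unable to find JAAS classes" error), left the principal Class reference null, and then passed that null to Subject.getPrincipals. A minimal sketch of that failure mode:

```java
import javax.security.auth.Subject;

// Sketch only -- not Hadoop code. Demonstrates that passing a null Class
// to Subject.getPrincipals throws exactly the NullPointerException seen
// in the trace ("invalid null Class provided").
public class NullClassDemo {

    static String reproduce() {
        Subject subject = new Subject();
        try {
            // Same call site as Subject.getPrincipals(Subject.java:809) above:
            subject.getPrincipals(null);
            return "no exception";
        } catch (NullPointerException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(reproduce());
    }
}
```

If this reading is right, the fix is to make sure the principal class the login module expects actually exists in the JVM you run on, which is why a build against the IBM JDK (or a classpath that includes its security classes) matters here.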
> > 
> 
> 


