hadoop-common-user mailing list archives

From Steve Loughran <ste...@hortonworks.com>
Subject Re: Hadoop 1.0.3 (nutch-1.5.1) throwing errors on AIX 6.1
Date Mon, 27 Aug 2012 01:49:41 GMT
On 24 August 2012 06:49, James F Walton <jfwalton@us.ibm.com> wrote:

> Good point.  I did some reading over there and it looks like even the IBM
> packaging of Hadoop (BigInsights) is geared strictly towards Linux.  Both
> the Enterprise and Basic editions only list support for Red Hat or SuSE
> Enterprise Linux.
>
> So, thanks to those who chimed in.  Off to platform migration planning I
> go.
>


I tried running Hadoop on JRockit a few years back; after being the only
person to file JIRAs related to that JVM, I reverted to the Sun JDK -that's
the only JVM Hadoop is tested against at scale before Apache releases.

One thing you could do is try and persuade the IBM Power JVM team to start
using Hadoop as part of their JVM qualification process -then get linked in
with the Jenkins-based build & test process, so that regressions get picked
up sooner rather than later. There's no fundamental reason why Hadoop won't
work on other platforms.



>
>
>
>
> From:        Mike Spreitzer/Watson/IBM@IBMUS
> To:        user@hadoop.apache.org
> Date:        08/24/2012 09:28 AM
> Subject:        Re: Hadoop 1.0.3 (nutch-1.5.1) throwing errors on AIX 6.1
> ------------------------------
>
>
>
> While I am not involved with it, I am aware that IBM has a Hadoop
> distribution of its own; I suspect you can expect better coverage from it
> than from the base distribution.  Here is a pointer:
> http://www-01.ibm.com/software/data/infosphere/biginsights/
>
> Regards,
> Mike
>
>
>
> From:        James F Walton/Southbury/IBM@IBMUS
> To:        user@hadoop.apache.org
> Date:        08/24/2012 09:18 AM
> Subject:        Re: Hadoop 1.0.3 (nutch-1.5.1) throwing errors on AIX 6.1
>  ------------------------------
>
>
>
> I'm not entirely sure it's fair to say it's a bug in the IBM JVM.  It's a
> current implementation difference.  They are still using platform-specific
> authentication modules for Windows, AIX, and Linux.  Even Sun/Oracle Java
> has a specific SolarisLoginModule, which is deprecated but still available.
>
> Essentially, depending on what OS/architecture you are on, one of the
> following will exist:
> com.ibm.security.auth.module.NTLoginModule
> com.ibm.security.auth.module.LinuxLoginModule
> com.ibm.security.auth.module.AIXLoginModule
> com.ibm.security.auth.module.AIX64LoginModule
>
> Basically, instead of two possible outcomes, there are four.
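
For anyone else hitting this, a quick way to see which of those module
classes a given JVM actually ships is to probe for them with Class.forName.
This is only a diagnostic sketch -the class name LoginModuleProbe is made up
for illustration; the module names are the ones listed above plus the
com.sun UnixLoginModule that appears further down the thread:

  // Diagnostic sketch only: report which JAAS login module classes are
  // loadable on the running JVM. Module names come from this thread;
  // "LoginModuleProbe" is just an illustrative name.
  public class LoginModuleProbe {
    public static void main(String[] args) {
      String[] candidates = {
          "com.ibm.security.auth.module.NTLoginModule",
          "com.ibm.security.auth.module.LinuxLoginModule",
          "com.ibm.security.auth.module.AIXLoginModule",
          "com.ibm.security.auth.module.AIX64LoginModule",
          "com.sun.security.auth.module.UnixLoginModule"
      };
      for (String name : candidates) {
        try {
          Class.forName(name);
          System.out.println("present: " + name);
        } catch (ClassNotFoundException e) {
          System.out.println("missing: " + name);
        }
      }
    }
  }

If the stack trace below is accurate, an IBM JVM on AIX should report the
LinuxLoginModule as missing while the AIX modules are present.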
>
>
>
> From:        Steve Loughran <stevel@hortonworks.com>
> To:        user@hadoop.apache.org
> Date:        08/22/2012 03:09 PM
> Subject:        Re: Hadoop 1.0.3 (nutch-1.5.1) throwing errors on AIX 6.1
>  ------------------------------
>
>
>
> This is something you ought to raise with the IBM JVM team, as it does
> appear to be a bug in their JVM.
>
> On 21 August 2012 10:11, James F Walton <jfwalton@us.ibm.com>
> wrote:
> Found part of the reason while digging through UserGroupInformation.java
>
> /* Return the OS login module class name */
> private static String getOSLoginModuleName() {
>   if (System.getProperty("java.vendor").contains("IBM")) {
>     return windows ? "com.ibm.security.auth.module.NTLoginModule"
>      : "com.ibm.security.auth.module.LinuxLoginModule";
>   } else {
>     return windows ? "com.sun.security.auth.module.NTLoginModule"
>       : "com.sun.security.auth.module.UnixLoginModule";
>   }
> }
>
>
> So basically, if you use IBM java, then you must be on either Windows or
> Linux.  IBM's java appears to have platform-specific LoginModules: there's
> AIXLoginModule for 32-bit java on AIX, and AIX64LoginModule for 64-bit java
> on AIX; however, the IBM Linux module appears not to have any 32-bit vs
> 64-bit differentiation.
>
> So, unless anyone has a means to disable this whole security setup (I'm
> not using a hadoop cluster or anything), or wants to dive headlong into
> making the necessary code changes (which I presume from my cursory scanning
> would include a little more than just the above snippet, like the
> getOSPrincipalClass as well), I guess I'll need to look at moving our
> crawler from AIX to Linux.
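
For completeness, here is a rough sketch of what an AIX-aware
getOSLoginModuleName might look like. It is untested: it assumes os.name
contains "AIX" on AIX and that the IBM JVM exposes a com.ibm.vm.bitmode
property ("32"/"64") for choosing the 64-bit module, and getOSPrincipalClass
would need a matching change that is not shown here.

  /* Sketch only -not a tested patch. Assumes "os.name" contains "AIX"
   * on AIX and that IBM JVMs expose "com.ibm.vm.bitmode" ("32"/"64"). */
  private static String getOSLoginModuleName() {
    String vendor = System.getProperty("java.vendor");
    String osName = System.getProperty("os.name");
    boolean isWindows = osName.startsWith("Windows");
    if (vendor.contains("IBM")) {
      if (isWindows) {
        return "com.ibm.security.auth.module.NTLoginModule";
      } else if (osName.contains("AIX")) {
        // Pick the 64-bit module when the JVM reports a 64-bit data model.
        boolean is64Bit =
            "64".equals(System.getProperty("com.ibm.vm.bitmode"));
        return is64Bit ? "com.ibm.security.auth.module.AIX64LoginModule"
                       : "com.ibm.security.auth.module.AIXLoginModule";
      } else {
        return "com.ibm.security.auth.module.LinuxLoginModule";
      }
    } else {
      return isWindows ? "com.sun.security.auth.module.NTLoginModule"
                       : "com.sun.security.auth.module.UnixLoginModule";
    }
  }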
>
> My java coding skills are not top notch, though I could probably fix it if
> the necessary updates didn't get too convoluted beyond what I think it
> might require.  I would hope there are not too many other instances where
> AIX incompatibility would rear its head, since prior to the security
> features overhaul it all worked fine.
>
> Thoughts?
>
> James
>
>
>
>
> From:        James F Walton/Southbury/IBM@IBMUS
> To:        user@hadoop.apache.org
> Date:        08/17/2012 02:47 PM
> Subject:        Hadoop 1.0.3 (nutch-1.5.1) throwing errors on AIX 6.1
>  ------------------------------
>
>
>
>
> When implementing Nutch 1.0 a while back, we had to point our scripts to
> /opt/freeware/bin to allow Nutch's Hadoop code to utilize the more
> Linux-like versions of various system commands (like df) in order for
> Hadoop to function properly on AIX.  After doing that, we've had no issues
> with Nutch's hadoop implementation from 1.0 through 1.4.
>
> Now I'm attempting to migrate my existing, functional configurations from
> my Nutch 1.4 installation to Nutch 1.5.1, which is now using Hadoop 1.0.3.
> When I attempt to run a crawl, I'm getting errors that seem to be coming
> from Hadoop and that seem to want to use a LinuxLoginModule class
> (see below).
>
> Is there some configuration setting, plugin or jar file, etc that is
> missing from the new version to make this all work on AIX again?
>
> 2012-08-02 20:03:21,271 ERROR crawl.Injector - Injector:
> java.lang.RuntimeException: java.io.IOException: failure to login
>      at
> org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:546)
>      at
> org.apache.hadoop.mapred.FileInputFormat.addInputPath(FileInputFormat.java:336)
>      at org.apache.nutch.crawl.Injector.inject(Injector.java:209)
>      at org.apache.nutch.crawl.Injector.run(Injector.java:248)
>      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>      at org.apache.nutch.crawl.Injector.main(Injector.java:238)
> Caused by: java.io.IOException: failure to login
>      at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:490)
>      at
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:452)
>      at
> org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1494)
>      at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1395)
>      at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>      at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>      at
> org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:542)
>      ... 5 more
> Caused by: javax.security.auth.login.LoginException: unable to find
> LoginModule class: com.ibm.security.auth.module.LinuxLoginModule
>      at
> javax.security.auth.login.LoginContext.invoke(LoginContext.java:834)
>      at
> javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
>      at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
>      at
> java.security.AccessController.doPrivileged(AccessController.java:284)
>      at
> javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
>      at javax.security.auth.login.LoginContext.login(LoginContext.java:599)
>      at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:471)
>      ... 11 more
>
>
>
> James
>
>
