hadoop-common-dev mailing list archives

From "Robert Chansler (JIRA)" <j...@apache.org>
Subject [jira] Updated: (HADOOP-3344) libhdfs: always builds 32bit, even when x86_64 Java used
Date Tue, 03 Mar 2009 22:58:56 GMT

     [ https://issues.apache.org/jira/browse/HADOOP-3344?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Robert Chansler updated HADOOP-3344:
------------------------------------

    Release Note: Changed build procedure for libhdfs to always make the 32-bit version. Build instructions are in the Jira item.  (was: Changed build procedure for libhdfs to always make the 32-bit version. Build instructions are in the Jira item.
To build libhdfs, use the following command:

ant compile -Dcompile.c++=true -Dlibhdfs=true

Make sure you have autoconf-2.61 installed.

The libhdfs=true flag builds libhdfs along with the other C++ components, and the resulting .so file is installed in the c++/<os_osarch_jvmdatamodel>/lib directory.

This patch addresses all three scenarios:

*32-bit OS, 32-bit Java => libhdfs should be built 32-bit; specify -m32
*64-bit OS, 32-bit Java => libhdfs should be built 32-bit; specify -m32
*64-bit OS, 64-bit Java => libhdfs should be built 64-bit; specify -m64 )
    Hadoop Flags: [Incompatible change, Reviewed]  (was: [Reviewed, Incompatible change])

> libhdfs: always builds 32bit, even when x86_64 Java used
> --------------------------------------------------------
>
>                 Key: HADOOP-3344
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3344
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: build, libhdfs
>         Environment: x86_64 linux, x86_64 Java installed
>            Reporter: Craig Macdonald
>            Assignee: Giridharan Kesavan
>             Fix For: 0.20.0
>
>         Attachments: HADOOP-3344-v2.patch, HADOOP-3344.patch, HADOOP-3344.patch, HADOOP-3344.patch, HADOOP-3344.v0.patch, HADOOP-3344.v1.patch, HADOOP-3344.v3.patch
>
>
> The makefile for libhdfs is hard-coded to compile 32-bit libraries. It should instead compile for the architecture of the Java installation in use.
> The relevant lines are:
> LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared -m32 -Wl,-x
> CPPFLAGS = -m32 -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
> $OS_ARCH can be (e.g.) amd64 if you're using a 64-bit Java on the x86_64 platform. So while gcc will try to link against the correct libjvm.so, it will fail because libhdfs is being built 32-bit (because of -m32):
> {noformat}
>      [exec] /usr/bin/ld: skipping incompatible /usr/java64/latest/jre/lib/amd64/server/libjvm.so when searching for -ljvm
>      [exec] /usr/bin/ld: cannot find -ljvm
>      [exec] collect2: ld returned 1 exit status
>      [exec] make: *** [/root/def/hadoop-0.16.3/build/libhdfs/libhdfs.so.1] Error 1
> {noformat}
> The solution is to specify -m32 or -m64 depending on the detected os.arch.
> There are 3 cases to check:
>  * 32bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
>  * 64bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
>  * 64bit OS, 64bit java => libhdfs should be built 64bit, specify -m64
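
The os.arch-to-flag mapping above can be sketched as a small shell helper. This is a hypothetical illustration, not the actual HADOOP-3344 patch (which does the detection inside the ant/make build); the 64-bit arch names listed are an assumption covering common Sun/Oracle JVM values.

```shell
#!/bin/sh
# Hypothetical helper: map a JVM os.arch value to the matching gcc
# machine-width flag. Anything not recognized as 64-bit defaults to -m32,
# matching the first two scenarios above.
arch_flag() {
  case "$1" in
    amd64|x86_64|ia64) echo "-m64" ;;  # 64-bit JVM => build libhdfs 64-bit
    *)                 echo "-m32" ;;  # 32-bit JVM (i386, i586, ...) => 32-bit
  esac
}

arch_flag amd64   # prints -m64
arch_flag i386    # prints -m32
```

The resulting flag would then be passed into both LDFLAGS and CPPFLAGS so that the compile and link steps agree on the word size.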

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

