From "Craig Macdonald (JIRA)" <j...@apache.org>
Subject [jira] Created: (HADOOP-3344) libhdfs: always builds 32bit, even when x86_64 Java used
Date Sat, 03 May 2008 22:38:55 GMT
libhdfs: always builds 32bit, even when x86_64 Java used
--------------------------------------------------------

                 Key: HADOOP-3344
                 URL: https://issues.apache.org/jira/browse/HADOOP-3344
             Project: Hadoop Core
          Issue Type: Bug
          Components: libhdfs
         Environment: x86_64 linux, x86_64 Java installed
            Reporter: Craig Macdonald


The makefile for libhdfs is hard-coded to compile 32bit libraries. It should perhaps compile
32bit or 64bit depending on which Java is configured.

The relevant lines are:

{noformat}
LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared -m32 -Wl,-x
CPPFLAGS = -m32 -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
{noformat}

$OS_ARCH can be (e.g.) amd64 if you're using a 64bit java on the x86_64 platform. So while
gcc will try to link against the correct libjvm.so, it will fail because libhdfs itself is being
built 32bit (because of -m32):

{noformat}
     [exec] /usr/bin/ld: skipping incompatible /usr/java64/latest/jre/lib/amd64/server/libjvm.so when searching for -ljvm
     [exec] /usr/bin/ld: cannot find -ljvm
     [exec] collect2: ld returned 1 exit status
     [exec] make: *** [/root/def/hadoop-0.16.3/build/libhdfs/libhdfs.so.1] Error 1
{noformat}
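
As a quick sanity check of the mismatch, the ELF class of the two sides can be compared with file(1). The libjvm.so path below is taken from the log above; the object file name under build/libhdfs/ is only a guess, so substitute whatever .o files your build produced:

{noformat}
# 64bit JVM library the linker is skipping (path from the log above)
file /usr/java64/latest/jre/lib/amd64/server/libjvm.so
#   -> ELF 64-bit LSB shared object, x86-64 ...

# objects produced by the libhdfs build are 32bit because of -m32
# (object name is a guess; inspect the .o files actually present)
file build/libhdfs/hdfs.o
#   -> ELF 32-bit LSB relocatable, Intel 80386 ...
{noformat}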

The solution should be to specify -m32 or -m64 depending on the os.arch detected; a possible makefile sketch follows the list below.

There are 3 cases to check:
 * 32bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
 * 64bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
 * 64bit OS, 64bit java => libhdfs should be built 64bit, specify -m64
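
One possible way to handle this in the makefile itself is a conditional on $(OS_ARCH), which is already passed in from the build (java's os.arch). This is only a sketch: the variable name JVM_DATA_MODEL is made up here, and only amd64 is handled, so other 64bit architectures (e.g. ia64, sparcv9) would need their own cases:

{noformat}
# Pick the word size from the JVM architecture instead of hard-coding -m32.
# OS_ARCH is assumed to come from the ant build as it does today.
ifeq ($(OS_ARCH),amd64)
  JVM_DATA_MODEL = -m64
else
  JVM_DATA_MODEL = -m32
endif

LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared $(JVM_DATA_MODEL) -Wl,-x
CPPFLAGS = $(JVM_DATA_MODEL) -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
{noformat}

Since a 32bit java on a 64bit OS reports an os.arch such as i386, it falls into the -m32 branch, which covers all three cases above.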

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
