hadoop-common-dev mailing list archives

From Allen Wittenauer <...@apache.org>
Subject Re: Hadoop native builds fail on ARM due to -m32
Date Wed, 11 May 2011 00:34:25 GMT

On May 10, 2011, at 5:13 PM, Trevor Robinson wrote:

> Is the native build failing on ARM (where gcc doesn't support -m32) a
> known issue, and is there a workaround or fix pending?

	That's interesting.  I didn't realize there was a gcc that didn't support -m.  This seems
like an odd thing not to support, but whatever. :)
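Whether a given gcc accepts `-m32` can be probed directly: the flag is an x86/PowerPC-family option, and ARM-targeted gcc rejects it outright. This is a hedged sketch of such a probe (it assumes only that `gcc` is on the PATH, and reports "unknown" otherwise):

```shell
# Probe whether the local gcc accepts -m32 by compiling an empty
# program from stdin; on ARM toolchains this fails, on x86 it succeeds.
if ! command -v gcc >/dev/null 2>&1; then
  m32_support="unknown (gcc not installed)"
elif echo 'int main(void){return 0;}' | gcc -m32 -xc - -o /dev/null 2>/dev/null; then
  m32_support="yes"
else
  m32_support="no (typical of ARM targets)"
fi
echo "gcc -m32 support: $m32_support"
```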

> The closest issue I can find is
> https://issues.apache.org/jira/browse/HADOOP-6258 (Native compilation
> assumes gcc), as well as other issues regarding where and how to
> specify -m32/64. However, there doesn't seem to be a specific issue
> covering build failure on systems using gcc where the gcc target does
> not support -m32/64 (such as ARM).

	I've got a homegrown patch that basically removes a lot of the GNU-ness from the configure
to support Sun's compiler, but I don't think I had to remove -m... so even that won't help.

> I've attached a patch that disables specifying -m$(JVM_DATA_MODEL)
> when $host_cpu starts with "arm". (For instance, host_cpu = armv7l for
> my system.) To any maintainers on this list, please let me know if
> you'd like me to open a new issue and/or attach this patch to an
> issue.
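The guard Trevor describes would likely take the shape of a `case` on `$host_cpu` in the configure script. This is a hedged sketch, not the attached patch itself; `host_cpu` and `JVM_DATA_MODEL` mirror the variables named in the message, and the values below are illustrative:

```shell
# Sketch of the guard: skip -m$(JVM_DATA_MODEL) when $host_cpu starts
# with "arm", since ARM gcc rejects the -m32/-m64 data-model flags.
host_cpu="armv7l"      # e.g. Trevor's system
JVM_DATA_MODEL="32"    # bitness reported by the JVM

case "$host_cpu" in
  arm*)
    # ARM toolchains have no -m32/-m64: pass no data-model flag.
    MFLAG=""
    ;;
  *)
    MFLAG="-m${JVM_DATA_MODEL}"
    ;;
esac
echo "data-model flag: '$MFLAG'"
```

On an x86 host (`host_cpu=x86_64`, say) the default branch would yield `-m32` or `-m64` as before; only `arm*` targets fall through with no flag.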

	Yes, please file a JIRA in HADOOP and attach the patch.
