hbase-dev mailing list archives

From Mikhail Bautin <bautin.mailing.li...@gmail.com>
Subject Re: loading Hadoop native libraries in HBase unit tests
Date Mon, 13 Feb 2012 19:21:37 GMT
Would the following work as a complete solution for any platform? We can
make this conditional on a new Maven profile.

   - Download the sources of the Hadoop version being used
   - Run "ant compile-native"
   - Add the directory of libhadoop.so to java.library.path in the test JVM
   options (a rough sketch of such a profile follows this list)
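
Here is a rough, untested sketch of what such a profile could look like in the
HBase pom.xml; the profile id, property name, and output path below are made up
for illustration, not anything we have agreed on:

  <!-- Hypothetical profile; id, property name, and path are placeholders. -->
  <profile>
    <id>nativeHadoopTests</id>
    <properties>
      <!-- Directory that would contain libhadoop.so after running
           "ant compile-native" in the downloaded Hadoop source tree. -->
      <hadoop.native.lib.dir>${project.build.directory}/hadoop-native/lib</hadoop.native.lib.dir>
    </properties>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-surefire-plugin</artifactId>
          <configuration>
            <!-- Point the forked test JVM at the native library directory. -->
            <argLine>-Djava.library.path=${hadoop.native.lib.dir}</argLine>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>

Running "mvn test -PnativeHadoopTests" would then activate it. The download and
"ant compile-native" steps themselves would still need something like an antrun
or exec step bound to an earlier build phase; the sketch only covers the last
bullet.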

Thanks,
--Mikhail

On Mon, Feb 13, 2012 at 11:15 AM, Todd Lipcon <todd@cloudera.com> wrote:

> Also keep in mind it's not just the hadoop version, but also the glibc
> version and host architecture. We'd have to publish built binaries for
> all combinations of architecture*hadoopVersion*glibcVersion
>
> Maybe we should just get a copy of _one_ of these versions on the
> hudson build boxes, and have a new hudson job which runs whichever
> tests depend on the native code there?
>
> -Todd
>
> On Mon, Feb 13, 2012 at 10:52 AM, Roman Shaposhnik <rvs@apache.org> wrote:
> > On Mon, Feb 13, 2012 at 1:58 AM, Mikhail Bautin
> > <bautin.mailing.lists@gmail.com> wrote:
> >> Then how about solving the issue for the most common case (the default
> >> version of Hadoop)? We can import the default version of libhadoop.so
> >> into the HBase codebase and load it in tests, as I mentioned. This can be
> >> considered a hack but will definitely increase the test coverage.
> >
> > You're not proposing importing a native binary into a source tree, are
> > you? That won't be very reliable at all.
> >
> > We can probably come up with a number of workarounds here, but at the end
> > of the day, unless you recompile the native bits here and now, chances
> > are they won't be compatible with the OS you happen to be on.
> >
> > Thanks,
> > Roman.
>
>
>
> --
> Todd Lipcon
> Software Engineer, Cloudera
>
