hbase-user mailing list archives

From Andrew Purtell <apurt...@apache.org>
Subject Re: [DISCUSS] Publishing hbase binaries with different hadoop release lines
Date Thu, 30 May 2019 17:39:35 GMT
The separate compatibility modules repo idea is appealing, especially if it
can provide a single jar that also shades and includes Hadoop classes and
their specific dependencies. This would simplify version-specific
deployment: unpack the HBase tarball, remove the included hbase-hadoop-*.jar,
download the appropriate hbase-hadoop-*.jar from the hbase-hadoop-compat
artifact, put it into lib/, and you are good to go. No need to worry about
multiple Hadoop component jars or their dependencies like guava and jackson.
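The jar-swap flow above could be scripted roughly as follows. This is a sketch only: the hbase-hadoop-*-compat artifact name and the download URL are placeholders, since no such artifacts are actually published.

```shell
#!/usr/bin/env bash
# Sketch of the hypothetical jar-swap deployment flow. Artifact names
# and the download URL are placeholders, not real published coordinates.
set -euo pipefail

# Name of the hypothetical per-Hadoop-line compat jar.
compat_jar_name() {
  local hadoop_line="$1"
  echo "hbase-hadoop-${hadoop_line}-compat.jar"
}

# Swap the bundled compat jar for one matching the target Hadoop line.
install_compat_jar() {
  local hbase_home="$1" hadoop_line="$2"
  local jar; jar="$(compat_jar_name "${hadoop_line}")"
  rm -f "${hbase_home}"/lib/hbase-hadoop-*.jar           # drop the bundled jar
  curl -fLo "${hbase_home}/lib/${jar}" \
    "https://example.invalid/hbase-hadoop-compat/${jar}" # placeholder URL
}

compat_jar_name "3.1"   # → hbase-hadoop-3.1-compat.jar
```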

The build and assembly files for such an uber compat module would also
serve as a template for anyone who wants to roll their own for a new Hadoop
version or an API compatible set of alternative jars.
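As a rough illustration of what the build file for such an uber compat module might contain, here is a minimal maven-shade-plugin fragment. The included artifacts and the relocation target package are illustrative assumptions, not actual coordinates from any HBase pom.

```xml
<!-- Hypothetical pom fragment for an uber hbase-hadoop compat module.
     Bundles the Hadoop client classes plus their fragile transitive
     dependencies (guava, jackson) and relocates them out of the way. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <artifactSet>
          <includes>
            <include>org.apache.hadoop:*</include>
            <include>com.google.guava:guava</include>
            <include>com.fasterxml.jackson.core:*</include>
          </includes>
        </artifactSet>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>shaded.hbase.hadoop.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```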

On Thu, May 30, 2019 at 6:33 AM Raymond Lau <rlau@attivio.com> wrote:

> The idea of a separate repo with compatibility modules sounds appealing.
> This refinement has several advantages, especially if we separate out the
> "compatibility API" from the actual compatibility module implementation:
> 1. We can choose certain major release lines of Hadoop and provide modules
> for those lines.
> 2. Customers wishing to support other Hadoop release lines can then create
> their own compatibility module. As long as they implement the API interfaces
> (or extend the API abstract classes) and get it to compile and work against
> their release of Hadoop, all is good. The API can be versioned along with
> hbase as a separate artifact.
> -----Original Message-----
> From: Sean Busbey <busbey@apache.org>
> Sent: Thursday, May 30, 2019 9:06 AM
> To: user@hbase.apache.org
> Cc: dev <dev@hbase.apache.org>
> Subject: Re: [DISCUSS] Publishing hbase binaries with different hadoop
> release lines
> What about moving back to having a per-major-version-of-Hadoop
> compatibility module again, one that always builds against the needed
> major version? (Presumably with some shell script magic to pick the right
> one?) That would be preferable, imho, to e.g. producing main project
> binary tarballs per Hadoop version.
> Or! We could move stuff that relies on brittle Hadoop internals into its
> own repo (or one of our existing repos) and build _that_ with binaries for
> our supported Hadoop versions. Then in the main project we can include the
> appropriate artifact for the version of Hadoop we happen to build with
> (essentially leaving the main repo how it is) and update our "replace the
> version of Hadoop!" note to include replacing this "HBase stuff that's
> closely tied to Hadoop internals" jar as well.
> On Wed, May 29, 2019, 08:41 张铎(Duo Zhang) <palomino219@gmail.com> wrote:
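The compatibility-API split Raymond describes above could be sketched roughly as follows. Every name here (ClusterCompat, Hadoop3Compat, the version string) is hypothetical for illustration, not an existing HBase class.

```java
// Hypothetical sketch of the "compatibility API" pattern: HBase codes
// against a stable interface, while per-Hadoop-line modules supply the
// implementation that touches brittle Hadoop internals.
public class CompatSketch {

    // Versioned along with HBase as its own artifact; implementations
    // live in separate per-Hadoop-line modules.
    interface ClusterCompat {
        String hadoopLine();   // which Hadoop release line this targets
        long usedHeapBytes();  // example call that wraps Hadoop internals
    }

    // One implementation per supported Hadoop line, compiled against
    // that line's jars (the Hadoop calls are stubbed out here).
    static final class Hadoop3Compat implements ClusterCompat {
        @Override public String hadoopLine() { return "3.x"; }
        @Override public long usedHeapBytes() {
            // A real module would delegate to Hadoop 3 internals here.
            return Runtime.getRuntime().totalMemory()
                 - Runtime.getRuntime().freeMemory();
        }
    }

    public static void main(String[] args) {
        ClusterCompat compat = new Hadoop3Compat();
        // prints "compat module for Hadoop 3.x"
        System.out.println("compat module for Hadoop " + compat.hadoopLine());
    }
}
```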

Best regards,

Words like orphans lost among the crosstalk, meaning torn from truth's
decrepit hands
   - A23, Crosstalk
