hdt-dev mailing list archives

From Jeffrey Zemerick <jzemer...@apache.org>
Subject Re: Plugin organization and build system
Date Thu, 10 Jan 2013 19:28:04 GMT
Sounds fine.

+1 for using Tycho. It's much simpler.


On Thu, Jan 10, 2013 at 2:11 PM, Adam Berry <adamb@apache.org> wrote:
> Hi everyone,
> First, I've dropped the code from Hadoop contrib into our git repo; it's on
> its own branch, hadoop-contrib. I put it on a branch because I think that
> splitting things up a little would be a good idea, and it should make it a
> little easier to support multiple versions of Hadoop.
> So, the tools as they stand are in just one plugin. Broadly, the features
> right now can be divided into:
> MapReduce project and class code support (wizards etc)
> Launch support for Hadoop
> HDFS interaction
> So taking a root name space of org.apache.hdt, I suggest something like the
> following for the plugin names
> org.apache.hdt.core
> org.apache.hdt.ui
> org.apache.hdt.debug.core
> org.apache.hdt.debug.ui
> org.apache.hdt.hdfs.core
> org.apache.hdt.hdfs.ui
> org.apache.hdt.help
> These may be a little fluid as we get into the details here, but from
> 10,000 feet it looks OK.
> Finally, I would also like to suggest Tycho (a Maven plugin for doing
> Eclipse builds) as our build tool. I've done my fair share of pure Ant PDE
> builds over the years, and Tycho is vastly easier; it would also make it
> much easier for people to build the tools themselves without having to do a
> bunch of local setup first.
> Thoughts? If everyone thinks these are ok, I'll enter some issues and get
> cracking.
> Cheers,
> Adam
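
For anyone unfamiliar with Tycho: it drives Eclipse plugin builds from the plugins' own MANIFEST.MF files, so a plain `mvn clean install` can build everything without a local PDE setup. A minimal parent POM along these lines might look like the sketch below; the module names follow Adam's proposed namespace, and the version numbers are assumptions, not decided values:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.apache.hdt</groupId>
  <artifactId>org.apache.hdt.parent</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>pom</packaging>

  <!-- Child modules, one per Eclipse plugin (hypothetical subset) -->
  <modules>
    <module>org.apache.hdt.core</module>
    <module>org.apache.hdt.ui</module>
  </modules>

  <build>
    <plugins>
      <!-- Enables Tycho's eclipse-plugin packaging types, so Maven
           builds each module from its MANIFEST.MF -->
      <plugin>
        <groupId>org.eclipse.tycho</groupId>
        <artifactId>tycho-maven-plugin</artifactId>
        <version>0.16.0</version>
        <extensions>true</extensions>
      </plugin>
    </plugins>
  </build>
</project>
```

Each child module would then use `<packaging>eclipse-plugin</packaging>` and inherit the Tycho configuration from this parent.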
