hive-user mailing list archives

From Edward Capriolo <>
Subject Re: Hive, datanucleus, jdbc, localmode.
Date Sat, 28 Dec 2013 15:22:13 GMT
You can follow along to what I do here.

Essentially Hive requires a HADOOP_HOME because it always wants to fork a
bin/hadoop process. hive-test helps you unpack Hadoop inside target/ and
point HADOOP_HOME at that directory instead of a system-wide install.

It would be nice if there was some other way to do this.
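To make the forking behavior concrete, here is a minimal sketch of what that looks like from the Java side. The `target/hadoop` path and the class name are hypothetical placeholders, not Hive's actual code; the point is only that the child process is resolved relative to HADOOP_HOME, so a test harness can redirect it:

```java
import java.io.File;
import java.util.Map;

// Hypothetical sketch: Hive builds a command line around $HADOOP_HOME/bin/hadoop
// and forks it as a child process. A test harness can point HADOOP_HOME at a
// copy of Hadoop unpacked under target/ instead of /usr/bin/hadoop.
public class HadoopHomeSketch {
    public static void main(String[] args) {
        File hadoopHome = new File("target/hadoop");
        ProcessBuilder pb = new ProcessBuilder(
                new File(hadoopHome, "bin/hadoop").getPath(), "jar", "job.jar");
        Map<String, String> env = pb.environment();
        env.put("HADOOP_HOME", hadoopHome.getAbsolutePath());
        // pb.start() would launch the child; here we only show the command line.
        System.out.println("would fork: " + pb.command());
    }
}
```

This is why classpath-only maven tricks are not enough on their own: the job submission leaves the JVM entirely.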

On Fri, Dec 27, 2013 at 10:27 PM, Jay Vyas <> wrote:

> Hi Hive:
> I'm attempting to create a robust eclipse based dev environment for
> testing my hive jobs in localmode however I run into classnotfound errors
> depending on which maven dependencies I use. Also, it seems when I change
> these dependencies from hive 0.12 to hive 0.11, I get other errors related
> to hive trying to launch jobs via calling /usr/bin/hadoop.
> Thus I am stuck: I can't run hive 12 in local java mode because of subtle
> datanucleus class and API inconsistencies which are tough to resolve, and
> when going to hive 11, it seems local mode is not natively detected via the
> jdbc URL...
> So I have 2 questions:
> 0) how does hive 12 versus 11 implement local mode differently ?
> And
> 1) What is the right way to run hive in pure java/local environments?
> The hive book suggests modifying configuration properties for local mode,
> but I have also found that in hive 0.12, using the jdbc:hive://
> connection URL automagically launches jobs in local mode.
> However, in 0.11 I see calls to /usr/bin/hadoop when running java classes
> in a local eclipse environment.
> Thanks!
> FYI, to see an example of my pom.xml, you can check out the
> github://jayunit100/bigpetstore pom.xml file.
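For reference, the configuration-property route mentioned above looks roughly like this in hive-site.xml. The property names are the Hadoop-1.x-era (pre-YARN) ones that apply to this Hive generation, and the values are placeholders, so treat this as a sketch rather than a tested recipe:

```xml
<!-- hive-site.xml sketch: run jobs locally instead of against a cluster. -->
<property>
  <name>fs.default.name</name>
  <value>file:///</value>
</property>
<property>
  <name>mapred.job.tracker</name>
  <value>local</value>
</property>
<property>
  <!-- let Hive automatically choose local mode for small jobs -->
  <name>hive.exec.mode.local.auto</name>
  <value>true</value>
</property>
```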
