Subject: Re: Hive 1.2.0 Unable to start metastore
From: James Pirz
To: user@hive.apache.org
Cc: Slava Markeyev
Date: Mon, 8 Jun 2015 19:19:49 -0700

Thanks for sharing the issue.
Currently I am using two different environment setups to run my sessions: one for Hive and one for Spark (so that the conflicting jars are not present at the same time), and this seems to solve my issues. Although I have still seen some problems, especially when I need to restart my metastore server.
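To make that a bit more concrete, the separation I have in mind is roughly along these lines (the paths, versions, and file names below are only illustrative examples, not my exact setup):

    # sourced only for Hive sessions (e.g. from hive-env.sh or a dedicated profile):
    # keep Spark's jars off the PATH, and prefer Hive's own jars (including its Derby)
    export HIVE_HOME=/opt/hive-1.2.0
    export HADOOP_HOME=/opt/hadoop-2.6.0
    export HADOOP_USER_CLASSPATH_FIRST=true
    export PATH=$HIVE_HOME/bin:$HADOOP_HOME/bin:/usr/bin:/bin

    # sourced only for Spark sessions, in a separate shell:
    export SPARK_HOME=/opt/spark-1.3.1
    export HADOOP_HOME=/opt/hadoop-2.6.0
    export PATH=$SPARK_HOME/bin:$HADOOP_HOME/bin:/usr/bin:/bin

That way only one set of libraries is visible to a given session at a time.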
On Mon, Jun 8, 2015 at 1:11 PM, Slava Markeyev wrote:

> Sounds like you ran into this:
> https://issues.apache.org/jira/browse/HIVE-9198
>
> On Mon, Jun 8, 2015 at 1:06 PM, James Pirz wrote:
>
>> Thanks!
>> There was a similar problem: conflicting jars, but between Hive and Spark.
>> My eventual goal is running Spark against Hive's tables, and because I had
>> Spark's libraries on my path as well, there were conflicting jar files.
>> I removed the Spark libraries from my PATH, and Hive's services (remote
>> metastore) started up just fine.
>> For now I am good, but I am wondering what the correct way to fix this is:
>> once I want to start Spark, I need to add its libraries back to the PATH,
>> and the conflicts seem inevitable.
>>
>> On Mon, Jun 8, 2015 at 12:09 PM, Slava Markeyev
>> <slava.markeyev@upsight.com> wrote:
>>
>>> It sounds like you are running into a jar conflict between the Derby
>>> packaged with Hive and the Derby packaged with the Hadoop distro. Look
>>> for derby jars on your system to confirm.
>>>
>>> In the meantime, try adding this to your hive-env.sh or hadoop-env.sh
>>> file:
>>>
>>> export HADOOP_USER_CLASSPATH_FIRST=true
>>>
>>> On Mon, Jun 8, 2015 at 11:52 AM, James Pirz wrote:
>>>
>>>> I am trying to run Hive 1.2.0 on Hadoop 2.6.0 (on a cluster running
>>>> CentOS). I am able to start the Hive CLI and run queries, but once I
>>>> try to start Hive's metastore (I am trying to use the built-in Derby)
>>>> using:
>>>>
>>>> hive --service metastore
>>>>
>>>> I keep getting ClassNotFound exceptions for
>>>> "org.apache.derby.jdbc.EmbeddedDriver" (see below).
>>>>
>>>> I have exported $HIVE_HOME and added $HIVE_HOME/bin and $HIVE_HOME/lib
>>>> to the $PATH, and I see that there is a "derby-10.11.1.1.jar" file
>>>> under $HIVE_HOME/lib.
>>>>
>>>> In my hive-site.xml (under $HIVE_HOME/conf) I have:
>>>>
>>>> <property>
>>>>   <name>javax.jdo.option.ConnectionDriverName</name>
>>>>   <value>org.apache.derby.jdbc.EmbeddedDriver</value>
>>>>   <description>Driver class name for a JDBC metastore</description>
>>>> </property>
>>>>
>>>> <property>
>>>>   <name>javax.jdo.option.ConnectionURL</name>
>>>>   <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
>>>>   <description>JDBC connect string for a JDBC metastore</description>
>>>> </property>
>>>>
>>>> So I am not sure why it cannot find it.
>>>> Any suggestion or hint would be highly appreciated.
>>>>
>>>> Here is the error:
>>>>
>>>> javax.jdo.JDOFatalInternalException: Error creating transactional
>>>> connection factory
>>>> ...
>>>> Caused by: java.lang.NoClassDefFoundError: Could not initialize class
>>>> org.apache.derby.jdbc.EmbeddedDriver
>>>>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>>   at java.lang.Class.newInstance(Class.java:379)
>>>>   at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:47)
>>>>   at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)
>>>>   at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
>>>>   at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)
>>>>   at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)
>>>
>>> --
>>> Slava Markeyev | Engineering | Upsight
>>> Find me on LinkedIn
>>
>
> --
> Slava Markeyev | Engineering | Upsight
> Find me on LinkedIn