Subject: Re: Issue while running Hive 0.13
From: Sarath Chandra <sarathchandra.josyam@algofusiontech.com>
To: user@hive.apache.org
Date: Wed, 16 Jul 2014 18:09:57 +0530

Thanks Jason. It worked.
There was a different version of the SLF4J libraries in $HADOOP_HOME/lib.
Once I synced the libraries on both sides, it started working.
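
In case it helps anyone else, the check and fix on my setup looked roughly like the
following (jar names, versions and exact locations will differ per install, so treat
this as a sketch rather than the exact commands):

  # see which SLF4J jars each side is loading
  # (assumes both lib directories hold slf4j-api / slf4j-log4j12 jars)
  ls $HADOOP_HOME/lib/slf4j-*.jar
  ls $HIVE_HOME/lib/slf4j-*.jar

  # move the older copies out of Hadoop's lib, then copy the newer set across
  # so Hadoop and Hive end up on the same SLF4J release
  mkdir -p ~/slf4j-backup
  mv $HADOOP_HOME/lib/slf4j-*.jar ~/slf4j-backup/
  cp $HIVE_HOME/lib/slf4j-*.jar $HADOOP_HOME/lib/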

On Fri, Jul 11, 2014 at 11:25 PM, Jason Dere <jdere@hortonworks.com> wrote:

> Looking at that error online, I see http://slf4j.org/faq.html#compatibility
> Maybe try to find what version of the slf4j libraries you have installed
> (in hadoop? hive?), and try updating to a later version.
>
> On Jul 10, 2014, at 9:57 PM, Sarath Chandra
> <sarathchandra.josyam@algofusiontech.com> wrote:
>
> I'm using Hadoop 1.0.4. Suspecting some compatibility issues, I moved from
> Hive 0.13 to Hive 0.12, but the exceptions related to SLF4J still persist.
>
> I'm unable to move forward with Hive to finalize a critical product design.
> Can somebody please help me?
>
> On Wed, Jul 9, 2014 at 11:25 AM, Sarath Chandra
> <sarathchandra.josyam@algofusiontech.com> wrote:
>
>> Thanks Deepesh.
>>
>> To use Hive with the embedded Derby metastore, I have put the configuration
>> below in hive-site.xml. As suggested on the net, I ran "schematool -dbType
>> derby -initSchema" and it created the $HIVE_HOME/metastore_db folder.
>>
>> Then, as suggested by you, I ran "hive --service metastore". Strangely, I'm
>> getting exceptions related to SLF4J -- java.lang.IllegalAccessError: tried
>> to access field org.slf4j.impl.StaticLoggerBinder.SINGLETON from class
>> org.slf4j.LoggerFactory
>>
>> Is there anything more to configure before starting?
>>
>> hive-site.xml:
>>
>> <configuration>
>>   <property>
>>     <name>javax.jdo.option.ConnectionURL</name>
>>     <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
>>   </property>
>>   <property>
>>     <name>hive.metastore.warehouse.dir</name>
>>     <value>/user/hive/warehouse</value>
>>   </property>
>>   <property>
>>     <name>hive.exec.scratchdir</name>
>>     <value>/tmp/hduser</value>
>>   </property>
>> </configuration>
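>>
>> For reference, the commands I ran (from memory, out of $HIVE_HOME, with the
>> hive-site.xml above in place) were roughly:
>>
>>   bin/schematool -dbType derby -initSchema   # created $HIVE_HOME/metastore_db
>>   bin/hive --service metastore               # metastore service, runs in the foreground
>>   bin/hive                                   # Hive CLI, from a second terminal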
>>
>> On Tue, Jul 8, 2014 at 11:20 PM, D K <deepeshk@gmail.com> wrote:
>>
>>> Did you start the Hive Metastore? You can start that by running
>>> hive --service metastore
>>>
>>> On Tue, Jul 8, 2014 at 5:27 AM, Sarath Chandra
>>> <sarathchandra.josyam@algofusiontech.com> wrote:
>>>
>>>> Thanks Santhosh.
>>>> So what I understand is that before launching the Hive shell, we need to
>>>> start the Hive server.
>>>> I tried starting the Hive server by running ./bin/hiveserver2. It just
>>>> prints "Starting HiveServer2" and keeps waiting. Nothing happens even
>>>> after waiting for several minutes.
>>>>
>>>> On Tue, Jul 8, 2014 at 4:16 PM, Santhosh Thomas
>>>> <santhosh.thomas@yahoo.com> wrote:
>>>>
>>>>> how did you start hive? Use hive-server2
>>>>>
>>>>> From: Sarath Chandra <sarathchandra.josyam@algofusiontech.com>
>>>>> To: user@hive.apache.org
>>>>> Sent: Tuesday, July 8, 2014 4:02 PM
>>>>> Subject: Issue while running Hive 0.13
>>>>>
>>>>> Hi,
>>>>>
>>>>> I'm a newbie to Hive and facing an issue while installing the stable
>>>>> version (0.13). I downloaded the tar file from the site
>>>>> (apache-hive-0.13.1-bin.tar.gz) and followed the instructions given on
>>>>> the Hive Wiki. On running the "hive" command to get to the Hive shell,
>>>>> I'm getting the exception below.
>>>>>
>>>>> Requesting help in this regard. What am I missing? Is there any further
>>>>> configuration to be done?
>>>>>
>>>>> Logging initialized using configuration in
>>>>> file:/usr/local/hive-0.13.1/conf/hive-log4j.properties
>>>>> Exception in thread "main" java.lang.RuntimeException:
>>>>> java.lang.RuntimeException: Unable to instantiate
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>>>> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
>>>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
>>>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>>>> Caused by: java.lang.RuntimeException: Unable to instantiate
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>>>> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
>>>>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
>>>>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
>>>>> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
>>>>> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
>>>>> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
>>>>> ... 7 more
>>>>> Caused by: java.lang.reflect.InvocationTargetException
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>>>> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
>>>>> ... 12 more
>>>>> Caused by: javax.jdo.JDOFatalInternalException: Error creating
>>>>> transactional connection factory
>>>>> NestedThrowables:
>>>>> java.lang.reflect.InvocationTargetException
>>>>> at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
>>>>> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
>>>>> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
>>>>> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>> at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
>>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>> at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
>>>>> at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
>>>>> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>>>>> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>>>>> at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
>>>>> at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
>>>>> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
>>>>> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
>>>>> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>>>>> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>>>>> at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
>>>>> at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
>>>>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
>>>>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
>>>>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
>>>>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
>>>>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
>>>>> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
>>>>> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
>>>>> at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
>>>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
>>>>> ... 17 more
>>>>> Caused by: java.lang.reflect.InvocationTargetException
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>>>> at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>>>>> at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
>>>>> at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)
>>>>> at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)
>>>>> at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>>>> at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>>>>> at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>>>>> at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
>>>>> at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
>>>>> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
>>>>> ... 46 more
>>>>> Caused by: java.lang.IllegalAccessError: tried to access field
>>>>> org.slf4j.impl.StaticLoggerBinder.SINGLETON from class
>>>>> org.slf4j.LoggerFactory
>>>>> at org.slf4j.LoggerFactory.<clinit>(LoggerFactory.java:60)
>>>>> at com.jolbox.bonecp.BoneCPConfig.<clinit>(BoneCPConfig.java:62)
>>>>> at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:59)
>>>>> at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
>>>>> at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)
>>>>> at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)
>>>>> ... 64 more
>>>>>
>>>>> ~Sarath