Date: Mon, 14 Mar 2011 15:07:19 -0400
Subject: Re: how to load hive table schema programmatically?
From: Edward Capriolo <edlinuxguru@gmail.com>
To: Carl Steinbach
Cc: dev@hive.apache.org, Jae Lee

On Mon, Mar 14, 2011 at 3:01 PM, Carl Steinbach wrote:
> Hi Ed,
>
> I'm pretty sure HiveMetaStoreClient is intended to be a public API.
>
> On Mon, Mar 14, 2011 at 11:49 AM, Edward Capriolo wrote:
>>
>> On Mon, Mar 14, 2011 at 2:44 PM, Jae Lee wrote:
>> > Ah... thanks a lot... that worked :)
>> >
>> > Is there any other recommended way to load Hive table metadata? I
>> > suppose accessing the metastore via HiveMetaStoreClient makes it
>> > possible to change the underlying data storage implementation choice.
>> >
>> > J
>> >
>> > On Mon, Mar 14, 2011 at 5:56 PM, Carl Steinbach wrote:
>> >
>> >> Hi Jae,
>> >>
>> >> Sounds like your problem is related to HIVE-1435
>> >> (https://issues.apache.org/jira/browse/HIVE-1435). You need to make
>> >> sure that the DataNucleus ORM layer is getting initialized with the
>> >> configuration property datanucleus.identifierFactory=datanucleus.
>> >> Probably the easiest way to fix this problem is to make sure that the
>> >> 0.7.0 version of hive-default.xml is available on the CLASSPATH and
>> >> is getting loaded into HiveConf. Try dumping the contents of your
>> >> HiveConf object and make sure that the values match those that appear
>> >> in the 0.7.0 version of hive-default.xml.
>> >>
>> >> Hope this helps.
>> >>
>> >> Carl
>> >>
>> >>
>> >> On Mon, Mar 14, 2011 at 10:41 AM, Jae Lee wrote:
>> >>
>> >>> just a bit more information from my debugging so far:
>> >>>
>> >>> my MySQL Hive metastore has columns like
>> >>> "integer_idx" in the "columns" table
>> >>> "integer_idx" in the "sort_cols" table
>> >>>
>> >>> Those columns look pretty suspicious, in that they are similar to
>> >>> the "idx" column that HiveMetaStoreClient complains is missing.
>> >>>
>> >>> It looks like the expectation of an "idx" column is auto-generated
>> >>> (not from the package.jdo document).
>> >>> Can anybody tell me whether the "integer_idx" column should have
>> >>> been an "idx" column in the "columns" table? Or am I supposed to
>> >>> have a custom package.jdo file that sets the index column name to
>> >>> "integer_idx" instead of "idx"?
>> >>>
>> >>> J
>> >>>
>> >>> On Mon, Mar 14, 2011 at 2:27 PM, Jae Lee wrote:
>> >>>
>> >>> > Hi,
>> >>> >
>> >>> > I've had this code below working with Hive 0.5:
>> >>> >
>> >>> > String databaseName = "default";
>> >>> > String tableName = "foobar";
>> >>> > List<FieldSchema> hiveTable =
>> >>> >     new HiveMetaStoreClient(new HiveConf(new Configuration(),
>> >>> >         SessionState.class)).getSchema(databaseName, tableName);
>> >>> >
>> >>> > It produces the list of FieldSchema objects for the table foobar
>> >>> > in the default database.
>> >>> >
>> >>> > I've recently upgraded Hive to 0.7, and the same code now
>> >>> > generates error messages such as:
>> >>> >
>> >>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was
>> >>> > thrown while adding/validating class(es) : Required columns
>> >>> > missing from table "`COLUMNS`" : `IDX`. Perhaps your MetaData is
>> >>> > incorrect, or you havent enabled "datanucleus.autoCreateColumns".
>> >>> > Required columns missing from table "`COLUMNS`" : `IDX`.
>> >>> > Perhaps your MetaData is incorrect, or you havent enabled
>> >>> > "datanucleus.autoCreateColumns".
>> >>> > org.datanucleus.store.rdbms.exceptions.MissingColumnException:
>> >>> > Required columns missing from table "`COLUMNS`" : `IDX`. Perhaps
>> >>> > your MetaData is incorrect, or you havent enabled
>> >>> > "datanucleus.autoCreateColumns".
>> >>> >   at org.datanucleus.store.rdbms.table.TableImpl.validateColumns(TableImpl.java:282)
>> >>> >   at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:175)
>> >>> >   at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2711)
>> >>> >   at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
>> >>> >   at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
>> >>> >   at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
>> >>> >   at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
>> >>> >   at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
>> >>> >   at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
>> >>> >   at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
>> >>> >   at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
>> >>> >   at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
>> >>> >   at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
>> >>> >   at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
>> >>> >   at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
>> >>> >   at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
>> >>> >   at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
>> >>> >   at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
>> >>> >   at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
>> >>> >   at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
>> >>> >   at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:266)
>> >>> >   at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:775)
>> >>> >   at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:709)
>> >>> >   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1076)
>> >>> >   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1073)
>> >>> >   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:307)
>> >>> >   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1073)
>> >>> >   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_schema(HiveMetaStore.java:1785)
>> >>> >   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getSchema(HiveMetaStoreClient.java:857)
>> >>> >   at HiveMetaStoreClientTest.shouldGetSchemaFromMetaStore(HiveMetaStoreClientTest.java:10)
>> >>> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >>> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >>> >   at java.lang.reflect.Method.invoke(Method.java:597)
>> >>> >   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
>> >>> >   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>> >>> >   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>> >>> >   at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
>> >>> >   at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>> >>> >   at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>> >>> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:73)
>> >>> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:46)
>> >>> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180)
>> >>> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:41)
>> >>> >   at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173)
>> >>> >   at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>> >>> >   at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>> >>> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
>> >>> >   at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>> >>> >   at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:94)
>> >>> >   at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:196)
>> >>> >   at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:65)
>> >>> >
>> >>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was
>> >>> > thrown while adding/validating class(es) : Required columns
>> >>> > missing from table "`SORT_COLS`" : `IDX`. Perhaps your MetaData is
>> >>> > incorrect, or you havent enabled "datanucleus.autoCreateColumns".
>> >>> > Required columns missing from table "`SORT_COLS`" : `IDX`. Perhaps
>> >>> > your MetaData is incorrect, or you havent enabled
>> >>> > "datanucleus.autoCreateColumns".
>> >>> > org.datanucleus.store.rdbms.exceptions.MissingColumnException:
>> >>> > Required columns missing from table "`SORT_COLS`" : `IDX`. Perhaps
>> >>> > your MetaData is incorrect, or you havent enabled
>> >>> > "datanucleus.autoCreateColumns".
>> >>> > [stack trace identical to the one above, elided]
>> >>> >
>> >>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was
>> >>> > thrown while adding/validating class(es) : Expected primary key
>> >>> > for table `SORT_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found in
>> >>> > existing keys PRIMARY KEY (`SD_ID`)
>> >>> > org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException:
>> >>> > Expected primary key for table `SORT_COLS` PRIMARY KEY
>> >>> > (`SD_ID`,`IDX`) not found in existing keys PRIMARY KEY (`SD_ID`)
>> >>> >   at org.datanucleus.store.rdbms.table.TableImpl.validatePrimaryKey(TableImpl.java:368)
>> >>> >   at org.datanucleus.store.rdbms.table.TableImpl.validate(TableImpl.java:180)
>> >>> > [remaining frames identical to the trace above, elided]
>> >>> >
>> >>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was
>> >>> > thrown while adding/validating class(es) : Required columns
>> >>> > missing from table "`BUCKETING_COLS`" : `IDX`. Perhaps your
>> >>> > MetaData is incorrect, or you havent enabled
>> >>> > "datanucleus.autoCreateColumns".
>> >>> > org.datanucleus.store.rdbms.exceptions.MissingColumnException:
>> >>> > Required columns missing from table "`BUCKETING_COLS`" : `IDX`.
>> >>> > Perhaps your MetaData is incorrect, or you havent enabled
>> >>> > "datanucleus.autoCreateColumns".
>> >>> > [stack trace identical to the first one, elided]
>> >>> >
>> >>> > 11/03/14 14:22:02 ERROR DataNucleus.Datastore: An exception was
>> >>> > thrown while adding/validating class(es) : Expected primary key
>> >>> > for table `BUCKETING_COLS` PRIMARY KEY (`SD_ID`,`IDX`) not found
>> >>> > in existing keys PRIMARY KEY (`SD_ID`)
>> >>> > org.datanucleus.store.rdbms.exceptions.WrongPrimaryKeyException:
>> >>> > Expected primary key for table `BUCKETING_COLS` PRIMARY KEY
>> >>> > (`SD_ID`,`IDX`) not found in existing keys PRIMARY KEY (`SD_ID`)
>> >>> > [stack trace identical to the one for `SORT_COLS`, elided]
>> >>> > >> >>> >> >>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(= HiveMetaStore.java:1073) >> >>> > at >> >>> > >> >>> >> >>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_schema= (HiveMetaStore.java:1785) >> >>> > =A0at >> >>> > >> >>> >> >>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getSchema(HiveM= etaStoreClient.java:857) >> >>> > at >> >>> > >> >>> >> >>> HiveMetaStoreClientTest.shouldGetSchemaFromMetaStore(HiveMetaStoreCl= ientTest.java:10) >> >>> > =A0at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) >> >>> > at >> >>> > >> >>> >> >>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl= .java:39) >> >>> > =A0at >> >>> > >> >>> >> >>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcce= ssorImpl.java:25) >> >>> > at java.lang.reflect.Method.invoke(Method.java:597) >> >>> > =A0at >> >>> > >> >>> >> >>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(Framewor= kMethod.java:44) >> >>> > at >> >>> > >> >>> >> >>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCa= llable.java:15) >> >>> > =A0at >> >>> > >> >>> >> >>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkM= ethod.java:41) >> >>> > at >> >>> > >> >>> >> >>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMe= thod.java:20) >> >>> > =A0at >> >>> > >> >>> >> >>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores= .java:28) >> >>> > at >> >>> > >> >>> >> >>> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.j= ava:31) >> >>> > =A0at >> >>> > >> >>> >> >>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRu= nner.java:73) >> >>> > at >> >>> > >> >>> >> >>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRu= nner.java:46) >> >>> > =A0at >> >>> > org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180) >> >>> > at 
>> >>> > at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173)
>> >>> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>> >>> > at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>> >>> > at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
>> >>> > at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>> >>> > at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:94)
>> >>> > at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:196)
>> >>> > at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:65)
>> >>> >
>> >>> > I've debugged through the code and found that those missing columns
>> >>> > were added by the DataNucleus JDO code. I'm not sure what this means;
>> >>> > the rest of the Hive applications are working perfectly fine.
>> >>> >
>> >>> > Is there anything I should try in order to figure out what's going on?
>> >>> > Or, more generally, is this the right way to get a schema from a Hive
>> >>> > table?
>> >>> >
>> >>> > J
>> >>
>>
>> Correct, you should not interface with the metastore this way, because
>> it is not a stable API you are working with.

I could go either way on this issue. It is public, and there are thrift stubs:
http://hive.apache.org/docs/r0.6.0/api/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.html

I would worry about users using raw methods such as:

  add_partition(Partition new_part)
  Add a partition to the table.

Users should try to interface through the CLI or HiveServer if possible.
Anything below that is "internal" IMHO, but I am guilty of using it
directly myself. Just be warned: it typically changes without any
deprecation period.

Edward
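P.S. For anyone following along, here is a minimal sketch of the read-only
call being discussed, against the 0.6/0.7-era client API. It assumes your
CLASSPATH carries the Hive jars plus a hive-default.xml/hive-site.xml that
HiveConf can load (including datanucleus.identifierFactory=datanucleus, per
HIVE-1435); the database and table names are placeholders.

```java
import java.util.List;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.hive.metastore.api.FieldSchema;

public class SchemaDump {
    public static void main(String[] args) throws Exception {
        // HiveConf reads hive-default.xml / hive-site.xml from the CLASSPATH.
        // If the DataNucleus properties there don't match the ones the schema
        // was created with, you get WrongPrimaryKeyException-style errors.
        HiveConf conf = new HiveConf(SchemaDump.class);

        HiveMetaStoreClient client = new HiveMetaStoreClient(conf);
        try {
            // getSchema returns the table's columns, with partition keys
            // appended after the regular columns.
            List<FieldSchema> schema = client.getSchema("default", "my_table");
            for (FieldSchema fs : schema) {
                System.out.println(fs.getName() + "\t" + fs.getType());
            }
        } finally {
            client.close();
        }
    }
}
```

Keep in mind this is exactly the layer I'd call "internal" above: the
method names and thrift signatures have changed between releases without
a deprecation cycle, so pin your Hive version if you depend on it.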