Subject: Re: exception when using Hive 0.12 with MySQL metastore
From: Zhang Xiaoyu
Date: Sat, 19 Oct 2013 22:10:07 -0700
To: user@hive.apache.org

Hi, Jov,

Thanks. I understand that turning on those two properties resolves the problem. But I ran the Hive 0.12 schema script, so I assume it should have created all the required tables.

Johnny

On Sat, Oct 19, 2013 at 7:44 PM, Jov <amutu@amutu.com> wrote:
> jov
>
> On Oct 20, 2013 8:07 AM, "Zhang Xiaoyu" <zhangxiaoyu912@gmail.com> wrote:
> >
> > Hi, all,
> > When I use Hive 0.12 with a MySQL metastore, I set these properties in
> > hive-site.xml:
> > datanucleus.autoCreateSchema = false
> > datanucleus.autoCreateTables = false
>
> You should set these properties to true; then Hive will auto-add the new column.
>
> > In beeline, "show tables" is fine, but creating a new table gets the
> > exception below. Any idea? Since I created the metastore tables with the
> > Hive 0.12 schema script, it shouldn't complain about missing columns in
> > the metastore tables.
> >
> > Thanks,
> > Johnny
> >
> > -------------------------
> >
> > FAILED: Error in metadata:
> > MetaException(message:javax.jdo.JDODataStoreException: Add request failed:
> > INSERT INTO `COLUMNS_V2`
> > (`CD_ID`,`FCOMMENT`,`COLUMN_NAME`,`TYPE_NAME`,`INTEGER_IDX`) VALUES
> > (?,?,?,?,?)
> > at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:422)
> > at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:745)
> > at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:765)
> > at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:638)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
> > at sun.proxy.$Proxy6.createTable(Unknown Source)
> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1081)
> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1114)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:102)
> > at sun.proxy.$Proxy8.create_table_with_environment_context(Unknown Source)
> > at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:464)
> > at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:453)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
> > at sun.proxy.$Proxy10.createTable(Unknown Source)
> > at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:593)
> > at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3784)
> > at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:256)
> > at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
> > at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
> > at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
> > at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
> > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
> > at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:95)
> > at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:193)
> > at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:148)
> > at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:203)
> > at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1133)
> > at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1118)
> > at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> > at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> > at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:40)
> > at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:37)
> > at java.security.AccessController.doPrivileged(Native Method)
> > at javax.security.auth.Subject.doAs(Subject.java:415)
> > at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
> > at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:526)
> > at org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGIContainingProcessor.java:37)
> > at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > at java.lang.Thread.run(Thread.java:722)
> > NestedThrowablesStackTrace:
> > com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown
> > column 'FCOMMENT' in 'field list'
>
> This column is new in 0.12; it will be auto-added if you set those settings to true.
>
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
> > at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
> > at com.mysql.jdbc.Util.getInstance(Util.java:386)
> > at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
> > at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
> > at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
> > at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
> > at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
> > at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2625)
> > at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2119)
> > at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2415)
> > at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2333)
> > at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2318)
> > at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
> > at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
> > at org.datanucleus.store.rdbms.SQLController.executeStatementUpdate(SQLController.java:407)
> > at org.datanucleus.store.rdbms.scostore.RDBMSJoinListStore.internalAdd(RDBMSJoinListStore.java:313)
> > at org.datanucleus.store.rdbms.scostore.AbstractListStore.addAll(AbstractListStore.java:136)
> > at org.datanucleus.store.mapped.mapping.CollectionMapping.postInsert(CollectionMapping.java:134)
> > at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:517)
> > at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:163)
> > at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:139)
> > at org.datanucleus.state.JDOStateManager.internalMakePersistent(JDOStateManager.java:2371)
> > at org.datanucleus.state.JDOStateManager.makePersistent(JDOStateManager.java:2347)
> > at org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1798)
> > at org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1886)
> > at org.datanucleus.store.mapped.mapping.PersistableMapping.setObjectAsValue(PersistableMapping.java:665)
> > at org.datanucleus.store.mapped.mapping.PersistableMapping.setObject(PersistableMapping.java:425)
> > at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:197)
> > at org.datanucleus.state.AbstractStateManager.providedObjectField(AbstractStateManager.java:1452)
> > at org.apache.hadoop.hive.metastore.model.MStorageDescriptor.jdoProvideField(MStorageDescriptor.java)
> > at org.apache.hadoop.hive.metastore.model.MStorageDescriptor.jdoProvideFields(MStorageDescriptor.java)
> > at org.datanucleus.state.AbstractStateManager.provideFields(AbstractStateManager.java:1520)
> > at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:288)
> > at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:163)
> > at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:139)
> > at org.datanucleus.state.JDOStateManager.internalMakePersistent(JDOStateManager.java:2371)
> > at org.datanucleus.state.JDOStateManager.makePersistent(JDOStateManager.java:2347)
> > at org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1798)
> > at org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1886)
> > at org.datanucleus.store.mapped.mapping.PersistableMapping.setObjectAsValue(PersistableMapping.java:665)
> > at org.datanucleus.store.mapped.mapping.PersistableMapping.setObject(PersistableMapping.java:425)
> > at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:197)
> > at org.datanucleus.state.AbstractStateManager.providedObjectField(AbstractStateManager.java:1452)
> > at org.apache.hadoop.hive.metastore.model.MTable.jdoProvideField(MTable.java)
> > at org.apache.hadoop.hive.metastore.model.MTable.jdoProvideFields(MTable.java)
> > at org.datanucleus.state.AbstractStateManager.provideFields(AbstractStateManager.java:1520)
> > at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:288)
> > at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:163)
> > at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:139)
> > at org.datanucleus.state.JDOStateManager.internalMakePersistent(JDOStateManager.java:2371)
> > at org.datanucleus.state.JDOStateManager.makePersistent(JDOStateManager.java:2347)
> > at org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1798)
> > at org.datanucleus.ObjectManagerImpl.persistObjectWork(ObjectManagerImpl.java:1647)
> > at org.datanucleus.ObjectManagerImpl.persistObject(ObjectManagerImpl.java:1504)
> > at org.datanucleus.MultithreadedObjectManager.persistObject(MultithreadedObjectManager.java:298)
> > at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:740)
> > at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:765)
> > at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:638)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
> > at sun.proxy.$Proxy6.createTable(Unknown Source)
> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1081)
> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1114)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:102)
> > at sun.proxy.$Proxy8.create_table_with_environment_context(Unknown Source)
> > at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:464)
> > at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:453)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
> > at sun.proxy.$Proxy10.createTable(Unknown Source)
> > at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:593)
> > at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3784)
> > at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:256)
> > at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
> > at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
> > at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
> > at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
> > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
> > at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:95)
> > at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:193)
> > at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:148)
> > at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:203)
> > at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1133)
> > at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1118)
> > at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> > at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> > at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:40)
> > at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:37)
> > at java.security.AccessController.doPrivileged(Native Method)
> > at javax.security.auth.Subject.doAs(Subject.java:415)
> > at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
> > at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:526)
> > at org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGIContainingProcessor.java:37)
> > at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > at java.lang.Thread.run(Thread.java:722)
> > )
> > FAILED: Execution Error, return code 1 from
> > org.apache.hadoop.hive.ql.exec.DDLTask
> > org.apache.hive.service.cli.HiveSQLException: Error while processing
> > statement: FAILED: Execution Error, return code 1 from
> > org.apache.hadoop.hive.ql.exec.DDLTask
> > at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:97)
> > at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:193)
> > at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:148)
> > at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:203)
> > at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1133)
> > at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1118)
> > at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> > at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> > at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:40)
> > at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:37)
> > at java.security.AccessController.doPrivileged(Native Method)
> > at javax.security.auth.Subject.doAs(Subject.java:415)
> > at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
> > at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:526)
> > at org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGIContainingProcessor.java:37)
> > at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > at java.lang.Thread.run(Thread.java:722)
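For reference, Jov's suggested workaround corresponds to the following hive-site.xml fragment. This is a sketch of the settings quoted in the thread, not an endorsed fix: letting DataNucleus alter a production metastore schema is generally discouraged in favor of running the official schema scripts.

```xml
<!-- hive-site.xml: let DataNucleus create/alter metastore tables automatically -->
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.autoCreateTables</name>
  <value>true</value>
</property>
```

After the missing column has been added once, these can be switched back to false so the metastore schema stays fixed.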
> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:256) > at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
> at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.= java:57)
> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
> at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation= .java:95)
> at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatemen= t(HiveSessionImpl.java:193)
> at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.= java:148)
> at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatemen= t(ThriftCLIService.java:203)
> at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteSta= tement.getResult(TCLIService.java:1133)
> at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteSta= tement.getResult(TCLIService.java:1118)
> at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)<= br> > at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) > at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGICont= ainingProcessor.java:40)
> at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGICont= ainingProcessor.java:37)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInfor= mation.java:1232)
> at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecu= re.java:526)
> at org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGICo= ntainingProcessor.java:37)
> at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThrea= dPoolServer.java:206)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecuto= r.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecut= or.java:615)
> at java.lang.Thread.run(Thread.java:722)
> )
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.= exec.DDLTask
> org.apache.hive.service.cli.HiveSQLException: Error while processing s= tatement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hiv= e.ql.exec.DDLTask
> at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation= .java:97)
> at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatemen= t(HiveSessionImpl.java:193)
> at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.= java:148)
> at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatemen= t(ThriftCLIService.java:203)
> at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteSta= tement.getResult(TCLIService.java:1133)
> at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteSta= tement.getResult(TCLIService.java:1118)
> at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)<= br> > at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) > at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGICont= ainingProcessor.java:40)
> at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGICont= ainingProcessor.java:37)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInfor= mation.java:1232)
> at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecu= re.java:526)
> at org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGICo= ntainingProcessor.java:37)
> at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThrea= dPoolServer.java:206)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecuto= r.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecut= or.java:615)
> at java.lang.Thread.run(Thread.java:722)

