hive-user mailing list archives

From SF Hadoop <sfhad...@gmail.com>
Subject Re: exception when using Hive 0.12 with MySQL metastore
Date Wed, 23 Oct 2013 21:27:15 GMT
Where is package.jdo located?  The one that you changed?

Thanks.
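(For reference: package.jdo is typically not a standalone file on disk; it is bundled inside the hive-metastore jar under `$HIVE_HOME/lib`. A small sketch for locating which jars bundle it, assuming a standard tarball layout; `find_package_jdo` is a hypothetical helper name, not part of Hive:)

```python
import os
import zipfile

def find_package_jdo(root):
    """Scan a directory tree for jars that bundle a package.jdo entry.

    Hypothetical helper: walks `root`, opens every .jar as a zip archive,
    and reports the jars containing a package.jdo entry.
    """
    hits = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".jar"):
                continue
            path = os.path.join(dirpath, name)
            try:
                with zipfile.ZipFile(path) as jar:
                    if any(entry.endswith("package.jdo")
                           for entry in jar.namelist()):
                        hits.append(path)
            except zipfile.BadZipFile:
                pass  # skip corrupt or non-zip .jar files
    return hits

# Typical usage (the path is an assumption about the install layout):
# find_package_jdo("/usr/lib/hive/lib")
```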



On Wed, Oct 23, 2013 at 1:22 PM, Timothy Potter <thelabdude@gmail.com> wrote:

> I updated package.jdo to use COMMENT instead of FCOMMENT (which is
> something I had to do for HCatalog a long while back) ... it may not be
> the "right" solution, but it worked for me.
>
> Cheers,
> Tim
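(The edit Tim describes amounts to changing the column-name mapping for the field-schema comment in the JDO metadata. A sketch of what that fragment might look like; the exact element names, attributes, and surrounding context in the real package.jdo may differ:)

```xml
<!-- Hypothetical package.jdo fragment: maps the comment field of a
     column descriptor to the COMMENT column (instead of FCOMMENT),
     matching what the MySQL schema script actually creates. -->
<field name="comment">
  <column name="COMMENT" length="256" jdbc-type="VARCHAR" allows-null="true"/>
</field>
```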
>
>
> On Wed, Oct 23, 2013 at 2:09 PM, SF Hadoop <sfhadoop@gmail.com> wrote:
>
>> Has anyone come up with further information on this issue?  I am
>> experiencing the same thing.
>>
>> Hive is set to auto-create the schema if it does not exist, yet it still
>> fails. I cannot create *any* table at all.
>>
>> Any help is appreciated.
>>
>>
>>
>>
>> On Sat, Oct 19, 2013 at 11:56 PM, Jov <amutu@amutu.com> wrote:
>>
>>> Can you confirm the script content? There may be a bug; if so, you can open an issue.
>>>
>>> jov
>>>
>>> On Oct 20, 2013 1:11 PM, "Zhang Xiaoyu" <zhangxiaoyu912@gmail.com>
>>> wrote:
>>> >
>>> > Hi, Jov,
>>> > Thanks. I understand that turning on those two properties resolves the
>>> problem. But I ran the Hive 0.12 schema script, which I assume should
>>> create all of the required tables.
>>> >
>>> > Johnny
>>> >
>>> >
>>> > On Sat, Oct 19, 2013 at 7:44 PM, Jov <amutu@amutu.com> wrote:
>>> >>
>>> >> jov
>>> >>
>>> >>
>>> >> On Oct 20, 2013 8:07 AM, "Zhang Xiaoyu" <zhangxiaoyu912@gmail.com>
>>> wrote:
>>> >> >
>>> >> > Hi, all,
>>> >> > I am using Hive 0.12 with a MySQL metastore, and I set these
>>> properties in hive-site.xml:
>>> >> > datanucleus.autoCreateSchema = false
>>> >> > datanucleus.autoCreateTables = false
>>> >> You should set these properties to true; then Hive will automatically
>>> add the new column.
>>> >>
>>> >> >
>>> >> > In Beeline, "show tables" works fine, but creating a new table fails
>>> with the exception below. Any ideas? Since I created the metastore tables
>>> with the Hive 0.12 schema script, it shouldn't complain about missing
>>> columns in the metastore tables.
>>> >> >
>>> >> > Thanks,
>>> >> > Johnny
>>> >> >
>>> >> > -------------------------
>>> >> >
>>> >> > FAILED: Error in metadata:
>>> MetaException(message:javax.jdo.JDODataStoreException: Add request failed :
>>> INSERT INTO `COLUMNS_V2`
>>> (`CD_ID`,`FCOMMENT`,`COLUMN_NAME`,`TYPE_NAME`,`INTEGER_IDX`) VALUES
>>> (?,?,?,?,?)
>>> >> > at
>>> org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:422)
>>> >> > at
>>> org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:745)
>>> >> > at
>>> org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:765)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:638)
>>> >> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> >> > at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> >> > at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> >> > at java.lang.reflect.Method.invoke(Method.java:601)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
>>> >> > at sun.proxy.$Proxy6.createTable(Unknown Source)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1081)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1114)
>>> >> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> >> > at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> >> > at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> >> > at java.lang.reflect.Method.invoke(Method.java:601)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:102)
>>> >> > at sun.proxy.$Proxy8.create_table_with_environment_context(Unknown
>>> Source)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:464)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:453)
>>> >> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> >> > at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> >> > at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> >> > at java.lang.reflect.Method.invoke(Method.java:601)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
>>> >> > at sun.proxy.$Proxy10.createTable(Unknown Source)
>>> >> > at
>>> org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:593)
>>> >> > at
>>> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3784)
>>> >> > at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:256)
>>> >> > at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
>>> >> > at
>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>> >> > at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
>>> >> > at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
>>> >> > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
>>> >> > at
>>> org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:95)
>>> >> > at
>>> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:193)
>>> >> > at
>>> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:148)
>>> >> > at
>>> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:203)
>>> >> > at
>>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1133)
>>> >> > at
>>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1118)
>>> >> > at
>>> org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>>> >> > at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>>> >> > at
>>> org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:40)
>>> >> > at
>>> org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:37)
>>> >> > at java.security.AccessController.doPrivileged(Native Method)
>>> >> > at javax.security.auth.Subject.doAs(Subject.java:415)
>>> >> > at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>>> >> > at
>>> org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:526)
>>> >> > at
>>> org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGIContainingProcessor.java:37)
>>> >> > at
>>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>>> >> > at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> >> > at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> >> > at java.lang.Thread.run(Thread.java:722)
>>> >> > NestedThrowablesStackTrace:
>>> >> > com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown
>>> column 'FCOMMENT' in 'field list'
>>> >> This column is new in 0.12; it will be added automatically if you set
>>> those settings to true.
>>> >>
>>> >> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>> Method)
>>> >> > at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>> >> > at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> >> > at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
>>> >> > at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
>>> >> > at com.mysql.jdbc.Util.getInstance(Util.java:386)
>>> >> > at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
>>> >> > at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
>>> >> > at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
>>> >> > at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
>>> >> > at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
>>> >> > at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2625)
>>> >> > at
>>> com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2119)
>>> >> > at
>>> com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2415)
>>> >> > at
>>> com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2333)
>>> >> > at
>>> com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2318)
>>> >> > at
>>> org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
>>> >> > at
>>> org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
>>> >> > at
>>> org.datanucleus.store.rdbms.SQLController.executeStatementUpdate(SQLController.java:407)
>>> >> > at
>>> org.datanucleus.store.rdbms.scostore.RDBMSJoinListStore.internalAdd(RDBMSJoinListStore.java:313)
>>> >> > at
>>> org.datanucleus.store.rdbms.scostore.AbstractListStore.addAll(AbstractListStore.java:136)
>>> >> > at
>>> org.datanucleus.store.mapped.mapping.CollectionMapping.postInsert(CollectionMapping.java:134)
>>> >> > at
>>> org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:517)
>>> >> > at
>>> org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:163)
>>> >> > at
>>> org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:139)
>>> >> > at
>>> org.datanucleus.state.JDOStateManager.internalMakePersistent(JDOStateManager.java:2371)
>>> >> > at
>>> org.datanucleus.state.JDOStateManager.makePersistent(JDOStateManager.java:2347)
>>> >> > at
>>> org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1798)
>>> >> > at
>>> org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1886)
>>> >> > at
>>> org.datanucleus.store.mapped.mapping.PersistableMapping.setObjectAsValue(PersistableMapping.java:665)
>>> >> > at
>>> org.datanucleus.store.mapped.mapping.PersistableMapping.setObject(PersistableMapping.java:425)
>>> >> > at
>>> org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:197)
>>> >> > at
>>> org.datanucleus.state.AbstractStateManager.providedObjectField(AbstractStateManager.java:1452)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.model.MStorageDescriptor.jdoProvideField(MStorageDescriptor.java)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.model.MStorageDescriptor.jdoProvideFields(MStorageDescriptor.java)
>>> >> > at
>>> org.datanucleus.state.AbstractStateManager.provideFields(AbstractStateManager.java:1520)
>>> >> > at
>>> org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:288)
>>> >> > at
>>> org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:163)
>>> >> > at
>>> org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:139)
>>> >> > at
>>> org.datanucleus.state.JDOStateManager.internalMakePersistent(JDOStateManager.java:2371)
>>> >> > at
>>> org.datanucleus.state.JDOStateManager.makePersistent(JDOStateManager.java:2347)
>>> >> > at
>>> org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1798)
>>> >> > at
>>> org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1886)
>>> >> > at
>>> org.datanucleus.store.mapped.mapping.PersistableMapping.setObjectAsValue(PersistableMapping.java:665)
>>> >> > at
>>> org.datanucleus.store.mapped.mapping.PersistableMapping.setObject(PersistableMapping.java:425)
>>> >> > at
>>> org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:197)
>>> >> > at
>>> org.datanucleus.state.AbstractStateManager.providedObjectField(AbstractStateManager.java:1452)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.model.MTable.jdoProvideField(MTable.java)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.model.MTable.jdoProvideFields(MTable.java)
>>> >> > at
>>> org.datanucleus.state.AbstractStateManager.provideFields(AbstractStateManager.java:1520)
>>> >> > at
>>> org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:288)
>>> >> > at
>>> org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:163)
>>> >> > at
>>> org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:139)
>>> >> > at
>>> org.datanucleus.state.JDOStateManager.internalMakePersistent(JDOStateManager.java:2371)
>>> >> > at
>>> org.datanucleus.state.JDOStateManager.makePersistent(JDOStateManager.java:2347)
>>> >> > at
>>> org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1798)
>>> >> > at
>>> org.datanucleus.ObjectManagerImpl.persistObjectWork(ObjectManagerImpl.java:1647)
>>> >> > at
>>> org.datanucleus.ObjectManagerImpl.persistObject(ObjectManagerImpl.java:1504)
>>> >> > at
>>> org.datanucleus.MultithreadedObjectManager.persistObject(MultithreadedObjectManager.java:298)
>>> >> > at
>>> org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:740)
>>> >> > at
>>> org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:765)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:638)
>>> >> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> >> > at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> >> > at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> >> > at java.lang.reflect.Method.invoke(Method.java:601)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
>>> >> > at sun.proxy.$Proxy6.createTable(Unknown Source)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1081)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1114)
>>> >> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> >> > at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> >> > at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> >> > at java.lang.reflect.Method.invoke(Method.java:601)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:102)
>>> >> > at sun.proxy.$Proxy8.create_table_with_environment_context(Unknown
>>> Source)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:464)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:453)
>>> >> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> >> > at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> >> > at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> >> > at java.lang.reflect.Method.invoke(Method.java:601)
>>> >> > at
>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
>>> >> > at sun.proxy.$Proxy10.createTable(Unknown Source)
>>> >> > at
>>> org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:593)
>>> >> > at
>>> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3784)
>>> >> > at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:256)
>>> >> > at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
>>> >> > at
>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>> >> > at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
>>> >> > at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
>>> >> > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
>>> >> > at
>>> org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:95)
>>> >> > at
>>> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:193)
>>> >> > at
>>> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:148)
>>> >> > at
>>> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:203)
>>> >> > at
>>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1133)
>>> >> > at
>>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1118)
>>> >> > at
>>> org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>>> >> > at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>>> >> > at
>>> org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:40)
>>> >> > at
>>> org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:37)
>>> >> > at java.security.AccessController.doPrivileged(Native Method)
>>> >> > at javax.security.auth.Subject.doAs(Subject.java:415)
>>> >> > at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>>> >> > at
>>> org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:526)
>>> >> > at
>>> org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGIContainingProcessor.java:37)
>>> >> > at
>>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>>> >> > at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> >> > at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> >> > at java.lang.Thread.run(Thread.java:722)
>>> >> > )
>>> >> > FAILED: Execution Error, return code 1 from
>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>> >> > org.apache.hive.service.cli.HiveSQLException: Error while
>>> processing statement: FAILED: Execution Error, return code 1 from
>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>> >> > at
>>> org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:97)
>>> >> > at
>>> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:193)
>>> >> > at
>>> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:148)
>>> >> > at
>>> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:203)
>>> >> > at
>>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1133)
>>> >> > at
>>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1118)
>>> >> > at
>>> org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>>> >> > at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>>> >> > at
>>> org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:40)
>>> >> > at
>>> org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:37)
>>> >> > at java.security.AccessController.doPrivileged(Native Method)
>>> >> > at javax.security.auth.Subject.doAs(Subject.java:415)
>>> >> > at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>>> >> > at
>>> org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:526)
>>> >> > at
>>> org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGIContainingProcessor.java:37)
>>> >> > at
>>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>>> >> > at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> >> > at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> >> > at java.lang.Thread.run(Thread.java:722)
>>> >
>>> >
>>>
>>>
>>
>
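(The two DataNucleus settings discussed in this thread can be sketched as a hive-site.xml fragment. The property names are quoted from the messages above; the values are what Jov recommends so that DataNucleus can add a missing column such as FCOMMENT to an older metastore schema:)

```xml
<!-- Sketch only: enables DataNucleus schema/table auto-creation so
     that missing metastore columns are added automatically. -->
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.autoCreateTables</name>
  <value>true</value>
</property>
```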
