ambari-issues mailing list archives

From "Gaurav Nagar (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (AMBARI-20574) Hive View 2.0 On SLES12-Table created on Upload throws error
Date Sun, 26 Mar 2017 10:21:42 GMT

     [ https://issues.apache.org/jira/browse/AMBARI-20574?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Gaurav Nagar updated AMBARI-20574:
----------------------------------
    Attachment: AMBARI-20574_branch-2.5.patch

> Hive View 2.0 On SLES12-Table created on Upload throws error 
> -------------------------------------------------------------
>
>                 Key: AMBARI-20574
>                 URL: https://issues.apache.org/jira/browse/AMBARI-20574
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-views
>    Affects Versions: 2.5.0
>            Reporter: Anusha Bilgi
>            Assignee: Gaurav Nagar
>            Priority: Blocker
>             Fix For: 2.5.1
>
>         Attachments: AMBARI-20574_branch-2.5.patch
>
>
> Caused by: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDODataStoreException: Put request failed : INSERT INTO "SERDE_PARAMS" ("PARAM_VALUE","SERDE_ID","PARAM_KEY") VALUES (?,?,?)
> 	at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543)
> 	at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:720)
> 	at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:740)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:921)
> 	at sun.reflect.GeneratedMethodAccessor95.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)
> 	at com.sun.proxy.$Proxy8.createTable(Unknown Source)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1534)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1579)
> 	at sun.reflect.GeneratedMethodAccessor94.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
> 	at com.sun.proxy.$Proxy12.create_table_with_environment_context(Unknown Source)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2167)
> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:97)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:735)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:723)
> 	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:178)
> 	at com.sun.proxy.$Proxy13.createTable(Unknown Source)
> 	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:762)
> 	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4409)
> 	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:316)
> 	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
> 	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
> 	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1748)
> 	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1494)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1291)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1158)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1153)
> 	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:197)
> 	at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:76)
> 	at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:253)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
> 	at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:264)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> NestedThrowablesStackTrace:
> org.datanucleus.store.rdbms.exceptions.MappedDatastoreException: INSERT INTO "SERDE_PARAMS" ("PARAM_VALUE","SERDE_ID","PARAM_KEY") VALUES (?,?,?)
> 	at org.datanucleus.store.rdbms.scostore.JoinMapStore.internalPut(JoinMapStore.java:1056)
> 	at org.datanucleus.store.rdbms.scostore.JoinMapStore.putAll(JoinMapStore.java:223)
> 	at org.datanucleus.store.rdbms.mapping.java.MapMapping.postInsert(MapMapping.java:135)
> 	at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:522)
> 	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:162)
> 	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:138)
> 	at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:3363)
> 	at org.datanucleus.state.StateManagerImpl.makePersistent(StateManagerImpl.java:3339)
> 	at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2079)
> 	at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2171)
> 	at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObjectAsValue(PersistableMapping.java:567)
> 	at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObject(PersistableMapping.java:321)
> 	at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:191)
> 	at org.datanucleus.state.AbstractStateManager.providedObjectField(AbstractStateManager.java:1460)
> 	at org.datanucleus.state.StateManagerImpl.providedObjectField(StateManagerImpl.java:120)
> 	at org.apache.hadoop.hive.metastore.model.MStorageDescriptor.dnProvideField(MStorageDescriptor.java)
> 	at org.apache.hadoop.hive.metastore.model.MStorageDescriptor.dnProvideFields(MStorageDescriptor.java)
> 	at org.datanucleus.state.StateManagerImpl.provideFields(StateManagerImpl.java:1170)
> 	at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:292)
> 	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:162)
> 	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:138)
> 	at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:3363)
> 	at org.datanucleus.state.StateManagerImpl.makePersistent(StateManagerImpl.java:3339)
> 	at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2079)
> 	at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2171)
> 	at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObjectAsValue(PersistableMapping.java:567)
> 	at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObject(PersistableMapping.java:321)
> 	at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:191)
> 	at org.datanucleus.state.AbstractStateManager.providedObjectField(AbstractStateManager.java:1460)
> 	at org.datanucleus.state.StateManagerImpl.providedObjectField(StateManagerImpl.java:120)
> 	at org.apache.hadoop.hive.metastore.model.MTable.dnProvideField(MTable.java)
> 	at org.apache.hadoop.hive.metastore.model.MTable.dnProvideFields(MTable.java)
> 	at org.datanucleus.state.StateManagerImpl.provideFields(StateManagerImpl.java:1170)
> 	at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:292)
> 	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:162)
> 	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:138)
> 	at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:3363)
> 	at org.datanucleus.state.StateManagerImpl.makePersistent(StateManagerImpl.java:3339)
> 	at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2079)
> 	at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1922)
> 	at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1777)
> 	at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
> 	at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:715)
> 	at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:740)
> 	at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:921)
> 	at sun.reflect.GeneratedMethodAccessor95.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)
> 	at com.sun.proxy.$Proxy8.createTable(Unknown Source)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1534)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1579)
> 	at sun.reflect.GeneratedMethodAccessor94.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
> 	at com.sun.proxy.$Proxy12.create_table_with_environment_context(Unknown Source)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2167)
> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:97)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:735)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:723)
> 	at sun.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:178)
> 	at com.sun.proxy.$Proxy13.createTable(Unknown Source)
> 	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:762)
> 	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4409)
> 	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:316)
> 	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
> 	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
> 	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1748)
> 	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1494)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1291)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1158)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1153)
> 	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:197)
> 	at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:76)
> 	at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:253)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
> 	at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:264)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
> 	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2198)
> 	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1927)
> 	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
> 	at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:561)
> 	at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:419)
> 	at org.postgresql.jdbc2.AbstractJdbc2Statement.executeUpdate(AbstractJdbc2Statement.java:365)
> 	at com.jolbox.bonecp.PreparedStatementHandle.executeUpdate(PreparedStatementHandle.java:205)
> 	at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeUpdate(ParamLoggingPreparedStatement.java:393)
> 	at org.datanucleus.store.rdbms.SQLController.executeStatementUpdate(SQLController.java:431)
> 	at org.datanucleus.store.rdbms.scostore.JoinMapStore.internalPut(JoinMapStore.java:1047)
> 	... 90 more
> )
> 	at org.apache.hive.jdbc.HiveStatement.waitForOperationToComplete(HiveStatement.java:348)
> 	at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:251)
> 	at org.apache.ambari.view.hive20.HiveJdbcConnectionDelegate.execute(HiveJdbcConnectionDelegate.java:49)
> 	at org.apache.ambari.view.hive20.actor.StatementExecutor.runStatement(StatementExecutor.java:87)
> 	at org.apache.ambari.view.hive20.actor.StatementExecutor.handleMessage(StatementExecutor.java:70)
> 	at org.apache.ambari.view.hive20.actor.HiveActor.onReceive(HiveActor.java:38)
> 	at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
> 	at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
> 	at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
> 	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
> 	at akka.actor.ActorCell.invoke(ActorCell.scala:487)
> 	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
> 	at akka.dispatch.Mailbox.run(Mailbox.scala:220)
> 	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
> 	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> 	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> 	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> 	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
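The bottom-most cause shows PostgreSQL rejecting the metastore INSERT into SERDE_PARAMS because a parameter value contains a NUL (0x00) byte, which PostgreSQL's UTF8 encoding does not permit in text values. A minimal sketch of one possible mitigation is to strip NUL bytes from serde parameter values (such as a '\0' field delimiter) before the CREATE TABLE is issued; the class and method names below are illustrative, not taken from the attached patch:

```java
// Hypothetical sketch (names are illustrative, not from AMBARI-20574_branch-2.5.patch):
// PostgreSQL's UTF8 encoding rejects the NUL byte (0x00), so any serde
// parameter value that may contain it -- e.g. a '\0' field delimiter --
// would need sanitizing before it reaches the metastore INSERT.
public class NulSanitizer {

    /** Remove 0x00 characters, which PostgreSQL rejects in UTF-8 text columns. */
    static String stripNul(String value) {
        return value == null ? null : value.replace("\u0000", "");
    }

    public static void main(String[] args) {
        // A delimiter value containing a NUL byte is reduced to its safe remainder.
        System.out.println(stripNul("field.delim=\u0000,"));  // prints "field.delim=,"
    }
}
```

An alternative is to replace the NUL delimiter with a Postgres-safe control character rather than dropping it, so the table's serde semantics are preserved.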



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
