hive-dev mailing list archives

From "brucewoo (JIRA)" <j...@apache.org>
Subject [jira] [Created] (HIVE-16988) If the partition column type is boolean, the Streaming API's AbstractRecordWriter.getPathForEndPoint fails with NoSuchObjectException: partition values=[Y, 2017-06-29 14:32:36.508]
Date Thu, 29 Jun 2017 07:23:00 GMT
brucewoo created HIVE-16988:
-------------------------------

             Summary: If the partition column type is boolean, the Streaming API's AbstractRecordWriter.getPathForEndPoint fails with NoSuchObjectException: partition values=[Y, 2017-06-29 14:32:36.508]
                 Key: HIVE-16988
                 URL: https://issues.apache.org/jira/browse/HIVE-16988
             Project: Hive
          Issue Type: Bug
            Reporter: brucewoo
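
A minimal sketch of the call path that produces the trace below, using the hive-hcatalog-streaming 2.0.0 API directly (the trace itself goes through NiFi's Hive processors, which wrap the same classes). The endpoint values are taken from the trace; the column names, delimiter, and sample row are assumptions, since the report does not include the table definition:

import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.List;

import org.apache.hive.hcatalog.streaming.DelimitedInputWriter;
import org.apache.hive.hcatalog.streaming.HiveEndPoint;
import org.apache.hive.hcatalog.streaming.StreamingConnection;
import org.apache.hive.hcatalog.streaming.TransactionBatch;

public class BooleanPartitionRepro {
    public static void main(String[] args) throws Exception {
        // Endpoint matching the failed connection in the trace: a boolean partition
        // value written as "Y" plus a timestamp partition value.
        List<String> partitionVals = Arrays.asList("Y", "2017-06-29 14:32:36.508");
        HiveEndPoint endPoint = new HiveEndPoint(
                "thrift://localhost:9083", "dw_subject", "alls", partitionVals);

        // Hypothetical column names; the report does not show the table schema.
        String[] fieldNames = {"id", "name"};

        // Constructing the writer resolves the partition path via
        // AbstractRecordWriter.getPathForEndPoint, which is where the trace
        // below fails with NoSuchObjectException for these partition values.
        DelimitedInputWriter writer = new DelimitedInputWriter(fieldNames, ",", endPoint);

        StreamingConnection conn = endPoint.newConnection(true); // create partition if missing
        TransactionBatch batch = conn.fetchTransactionBatch(10, writer);
        batch.beginNextTransaction();
        batch.write("1,example".getBytes(StandardCharsets.UTF_8));
        batch.commit();
        batch.close();
        conn.close();
    }
}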


org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to EndPoint {metaStoreUri='thrift://localhost:9083', database='dw_subject', table='alls', partitionVals=[Y, 2017-06-29 14:32:36.508] }
	at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:53) ~[nifi-hive-processors-1.1.2.jar:1.1.2]
	at org.apache.nifi.processors.hive.InsertTdfHive2.getOrCreateWriter(InsertTdfHive2.java:971) [nifi-hive-processors-1.1.2.jar:1.1.2]
	at org.apache.nifi.processors.hive.InsertTdfHive2.putStreamingHive(InsertTdfHive2.java:872) [nifi-hive-processors-1.1.2.jar:1.1.2]
	at org.apache.nifi.processors.hive.InsertTdfHive2.onTrigger(InsertTdfHive2.java:411) [nifi-hive-processors-1.1.2.jar:1.1.2]
	at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) [nifi-api-1.1.2.jar:1.1.2]
	at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1099) [nifi-framework-core-1.1.2.jar:1.1.2]
	at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.1.2.jar:1.1.2]
	at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:1) [nifi-framework-core-1.1.2.jar:1.1.2]
	at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.1.2.jar:1.1.2]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_131]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_131]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_131]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_131]
	at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
Caused by: org.apache.hive.hcatalog.streaming.StreamingException: partition values=[Y, 2017-06-29 14:32:36.508]. Unable to get path for end point: [Y, 2017-06-29 14:32:36.508]
	at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.getPathForEndPoint(AbstractRecordWriter.java:268) ~[hive-hcatalog-streaming-2.0.0.jar:2.0.0]
	at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.<init>(AbstractRecordWriter.java:79) ~[hive-hcatalog-streaming-2.0.0.jar:2.0.0]
	at org.apache.hive.hcatalog.streaming.DelimitedInputWriter.<init>(DelimitedInputWriter.java:121) ~[hive-hcatalog-streaming-2.0.0.jar:2.0.0]
	at org.apache.hive.hcatalog.streaming.DelimitedInputWriter.<init>(DelimitedInputWriter.java:98) ~[hive-hcatalog-streaming-2.0.0.jar:2.0.0]
	at org.apache.hive.hcatalog.streaming.DelimitedInputWriter.<init>(DelimitedInputWriter.java:79) ~[hive-hcatalog-streaming-2.0.0.jar:2.0.0]
	at org.apache.nifi.util.hive.HiveWriter.getDelimitedInputWriter(HiveWriter.java:60) ~[nifi-hive-processors-1.1.2.jar:1.1.2]
	at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:46) ~[nifi-hive-processors-1.1.2.jar:1.1.2]
	... 15 common frames omitted
Caused by: org.apache.hadoop.hive.metastore.api.NoSuchObjectException: partition values=[Y, 2017-06-29 14:32:36.508]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partition_result$get_partition_resultStandardScheme.read(ThriftHiveMetastore.java) ~[hive-exec-2.0.0.jar:2.0.0]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partition_result$get_partition_resultStandardScheme.read(ThriftHiveMetastore.java) ~[hive-exec-2.0.0.jar:2.0.0]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_partition_result.read(ThriftHiveMetastore.java) ~[hive-exec-2.0.0.jar:2.0.0]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86) ~[libthrift-0.9.3.jar:0.9.3]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_partition(ThriftHiveMetastore.java:1924) ~[hive-exec-2.0.0.jar:2.0.0]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_partition(ThriftHiveMetastore.java:1909) ~[hive-exec-2.0.0.jar:2.0.0]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getPartition(HiveMetaStoreClient.java:1231) ~[hive-exec-2.0.0.jar:2.0.0]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:152) ~[hive-exec-2.0.0.jar:2.0.0]
	at com.sun.proxy.$Proxy112.getPartition(Unknown Source) ~[na:na]
	at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.getPathForEndPoint(AbstractRecordWriter.java:263) ~[hive-hcatalog-streaming-2.0.0.jar:2.0.0]
	... 21 common frames omitted
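
The innermost failure is the metastore get_partition lookup made from AbstractRecordWriter.getPathForEndPoint. A hedged diagnostic sketch, querying the metastore directly with the same literal partition values to see whether any stored partition matches them (metastore URI, database, and table come from the trace; everything else is an assumption):

import java.util.Arrays;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.hive.metastore.IMetaStoreClient;
import org.apache.hadoop.hive.metastore.api.NoSuchObjectException;
import org.apache.hadoop.hive.metastore.api.Partition;

public class PartitionLookupCheck {
    public static void main(String[] args) throws Exception {
        HiveConf conf = new HiveConf();
        conf.setVar(HiveConf.ConfVars.METASTOREURIS, "thrift://localhost:9083");
        IMetaStoreClient msClient = new HiveMetaStoreClient(conf);
        try {
            // Same call that fails in the trace: getPartition with the literal
            // values ["Y", "2017-06-29 14:32:36.508"].
            Partition p = msClient.getPartition(
                    "dw_subject", "alls", Arrays.asList("Y", "2017-06-29 14:32:36.508"));
            System.out.println("Found partition at: " + p.getSd().getLocation());
        } catch (NoSuchObjectException e) {
            // Listing existing partition names shows how the metastore actually
            // stored the boolean partition value, for comparison against "Y".
            System.out.println("No exact match; stored partitions: "
                    + msClient.listPartitionNames("dw_subject", "alls", (short) 10));
        } finally {
            msClient.close();
        }
    }
}

Comparing the stored partition names against the value the Streaming API sends ("Y") should show whether the boolean partition value is being normalized differently on the two paths.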




--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
