nifi-users mailing list archives

From <mohit.j...@open-insights.co.in>
Subject RE: Auto Partitioning in PutHiveStreaming Processor
Date Wed, 25 Oct 2017 10:06:41 GMT
Hi,

JSONDPI3 is an external table, though I am facing the same issue with both the managed and external
table. Sharing the details of JSONDPI3 with you.
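As a point of reference, my understanding is that Hive Streaming generally expects a managed, bucketed, ORC-backed, transactional table rather than an external one. A minimal sketch of such a definition is below; the table, column and partition names are purely illustrative and not the actual JSONDPI3 schema:

CREATE TABLE test.streaming_example (
  id      BIGINT,   -- illustrative column
  payload STRING    -- illustrative column
)
PARTITIONED BY (device_id STRING)           -- illustrative partition column
CLUSTERED BY (id) INTO 4 BUCKETS            -- Hive Streaming requires bucketed tables
STORED AS ORC                               -- ... stored as ORC
TBLPROPERTIES ('transactional' = 'true');   -- ... with ACID/transactions enabled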





 

Thanks.

From: Pierre Villard [mailto:pierre.villard.fr@gmail.com] 
Sent: 25 October 2017 15:30
To: users@nifi.apache.org
Subject: Re: Auto Partitioning in PutHiveStreaming Processor

 

In your stack trace / configuration, I see that the table name is jsondpi3, but the table
in your last message is different. Are the descriptions of both tables the same?

Pierre

 

2017-10-25 11:50 GMT+02:00 <mohit.jain@open-insights.co.in>:

Hi Pierre,

 

Please find the describe formatted output:

 



 

Regards,

Mohit

From: Pierre Villard [mailto:pierre.villard.fr@gmail.com]
Sent: 25 October 2017 15:14
To: users@nifi.apache.org
Subject: Re: Auto Partitioning in PutHiveStreaming Processor

 

Hi Mohit,

Can you share a 'describe formatted' of your table in Hive?
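For example, assuming the test database and jsondpi3 table from your stack trace:

DESCRIBE FORMATTED test.jsondpi3;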

Thanks!

 

2017-10-25 9:29 GMT+02:00 <mohit.jain@open-insights.co.in>:

Hi All,

 

I am trying to write data to dynamic partitions in Hive using the PutHiveStreaming processor.

It fails when auto partitioning is needed, though it works fine if the partition is already created.
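From what I understand, streaming into auto-created partitions also depends on ACID/transaction support being enabled on the Hive side. As a rough sketch of the hive-site.xml settings I believe are typically required (illustrative values, not my exact configuration):

hive.support.concurrency       = true
hive.txn.manager               = org.apache.hadoop.hive.ql.lockmgr.DbTxnManager
hive.compactor.initiator.on    = true
hive.compactor.worker.threads  = 1    (any value greater than 0)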


 

NiFi throws the following error in the auto partitioning case:

 

12:48:14 IST ERROR 80ee3b56-68a5-13e1-3251-9cc44bebddf4
PutHiveStreaming[id=80ee3b56-68a5-13e1-3251-9cc44bebddf4] Failed to create HiveWriter for endpoint: {metaStoreUri='thrift://localhost:9083', database='test', table='jsondpi3', partitionVals=[356945013] }: org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to EndPoint {metaStoreUri='thrift://localhost:9083', database='test', table='jsondpi3', partitionVals=[356945013] }

12:48:14 IST ERROR 80ee3b56-68a5-13e1-3251-9cc44bebddf4
PutHiveStreaming[id=80ee3b56-68a5-13e1-3251-9cc44bebddf4] Error connecting to Hive endpoint: table jsondpi3 at thrift://localhost:9083

12:48:14 IST ERROR 80ee3b56-68a5-13e1-3251-9cc44bebddf4
PutHiveStreaming[id=80ee3b56-68a5-13e1-3251-9cc44bebddf4] Hive Streaming connect/write error, flow file will be penalized and routed to retry. org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to EndPoint {metaStoreUri='thrift://localhost:9083', database='test', table='jsondpi3', partitionVals=[356945013] }: org.apache.nifi.processors.hive.PutHiveStreaming$ShouldRetryException: Hive Streaming connect/write error, flow file will be penalized and routed to retry. org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to EndPoint {metaStoreUri='thrift://localhost:9083', database='test', table='jsondpi3', partitionVals=[356945013] }

 

The NiFi app log shows the following:

 

org.apache.nifi.processors.hive.PutHiveStreaming$ShouldRetryException: Hive Streaming connect/write error, flow file will be penalized and routed to retry. org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to EndPoint {metaStoreUri='thrift://localhost:9083', database='test', table='jsondpi3', partitionVals=[356945013] }
               at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onHiveRecordsError$1(PutHiveStreaming.java:527)
               at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
               at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onHiveRecordError$2(PutHiveStreaming.java:545)
               at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:148)
               at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$12(PutHiveStreaming.java:677)
               at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2174)
               at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2144)
               at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:631)
               at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$4(PutHiveStreaming.java:555)
               at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
               at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
               at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:555)
               at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1119)
               at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
               at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
               at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
               at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
               at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
               at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
               at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
               at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
               at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
               at java.lang.Thread.run(Thread.java:748)

 

 

The detailed error is attached.

The property details are also attached.

 

Please let me know what I am doing wrong.

 

Regards,

Mohit Jain

 

 

