flink-user mailing list archives

From Nico Kruber <n...@data-artisans.com>
Subject Re: Using HiveBolt from storm-hive with Flink-Storm compatibility wrapper
Date Mon, 25 Sep 2017 14:16:33 GMT
Hi Federico,
I also did not find any implementation of a Hive sink, nor much detail on this 
topic in general. Let me forward this to Timo and Fabian (cc'd), who may know 
more.

Nico

On Friday, 22 September 2017 12:14:32 CEST Federico D'Ambrosio wrote:
> Hello everyone,
> 
> I'd like to use the HiveBolt from storm-hive inside a Flink job via the
> Flink-Storm compatibility layer, but I'm not sure how to integrate it. Let
> me explain: I would have the following:
> 
> val mapper = ...
> 
> val hiveOptions = ...
> 
> streamByID
>   .transform[OUT]("hive-sink", new BoltWrapper[IN, OUT](new
> HiveBolt(hiveOptions)))
> 
> where streamByID is a DataStream[Event].
> 
> What would the IN and OUT types be? HiveBolt executes on a Storm Tuple, so
> I'd think that IN should be an Event "tuple-d" ( event => (field1, field2,
> field3, ...) ), while OUT, since I don't want the stream to keep flowing,
> would be null or None?
> 
> Alternatively, do you know of any implementation of a Hive sink in Flink,
> other than adapting the aforementioned HiveBolt into a RichSinkFunction?
> 
> Thanks for your attention,
>  Federico
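
For reference, a rough sketch of how the wiring in the quoted snippet could look. This is untested and hedged: it assumes flink-storm (Flink's Storm compatibility layer) and storm-hive are on the classpath, and the `Event` class, field names, metastore URI, and table names are illustrative placeholders, not anything from the original message.

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.storm.wrappers.BoltWrapper
import org.apache.storm.hive.bolt.HiveBolt
import org.apache.storm.hive.bolt.mapper.DelimitedRecordHiveMapper
import org.apache.storm.hive.common.HiveOptions
import org.apache.storm.tuple.Fields

// Hypothetical event type standing in for the poster's Event class.
case class Event(field1: String, field2: String, field3: String)

object HiveSinkSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val streamByID: DataStream[Event] = env.fromElements(Event("a", "b", "c"))

    // The mapper tells HiveBolt how to map Storm tuple fields onto Hive columns.
    val mapper = new DelimitedRecordHiveMapper()
      .withColumnFields(new Fields("field1", "field2", "field3"))

    // Placeholder metastore URI, database, and table.
    val hiveOptions =
      new HiveOptions("thrift://metastore:9083", "default", "events", mapper)

    // IN is the Flink-side record type fed to the bolt (a Scala tuple whose
    // positions line up with the mapper's column fields). OUT is only a declared
    // type parameter: HiveBolt emits nothing, so no OUT records are ever produced,
    // and Unit serves as a placeholder.
    streamByID
      .map(e => (e.field1, e.field2, e.field3))
      .transform[Unit]("hive-sink",
        new BoltWrapper[(String, String, String), Unit](new HiveBolt(hiveOptions)))

    env.execute("hive-sink-sketch")
  }
}
```

Whether BoltWrapper's tuple conversion and HiveBolt's lifecycle (flushing, transaction batches) actually behave correctly outside a Storm topology is exactly the open question of this thread.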


