beam-dev mailing list archives

From Amit Sela <>
Subject Testing streaming pipeline on Dataflow
Date Sat, 04 Feb 2017 00:07:49 GMT
Hi Davor,

My team wants to test streaming pipelines on Dataflow (Beam pipelines, of
course), and I was wondering how this works in terms of UnboundedSources -
Kafka/Pub/Sub? We currently use Kafka, and I was wondering if I could
"record" a chunk of the (public) data we use and import it. Can I throttle
the input? Can I loop-feed it (just to keep it going for a while)?

I thought I'd ask you for good pointers instead of looking around myself
(that's me being lazy whenever I can ;-) ).

On a side note, I followed up on the package-tracking link you sent me, and
it looks like the package went from Seattle to Seattle. It had been that way
after about 3-4 days, and I sampled it once a week to see if it got updated
somehow, but now I think something might be wrong.
