beam-user mailing list archives

From Antony Mayi <antonym...@yahoo.com>
Subject appending beam pipeline to spark job
Date Wed, 10 May 2017 08:16:54 GMT
I've got a (dirty) use case: an existing Spark batch job produces output that I would like to
feed into my Beam pipeline (assuming it runs on the SparkRunner). I tried running it as one
job (the output is already reduced, so it's small data and fine to do something like
Create.of(rdd.collect())), but that fails because of the two separate Spark contexts.
Is it possible to build the Beam pipeline on an existing Spark context?
thx, Antony.
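
[Editor's note: depending on the Beam version, the Spark runner exposes SparkContextOptions,
which lets you hand the runner an existing JavaSparkContext instead of letting it create its
own. Below is a minimal sketch of that approach; the local master, app name, and sample data
are illustrative assumptions standing in for the existing batch job.]

import java.util.Arrays;
import java.util.List;

import org.apache.beam.runners.spark.SparkContextOptions;
import org.apache.beam.runners.spark.SparkRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkThenBeam {
  public static void main(String[] args) {
    // The existing Spark batch job's context (local master used here only for illustration).
    JavaSparkContext jsc = new JavaSparkContext("local[2]", "spark-then-beam");

    // Stand-in for the existing job: a small, already-reduced result.
    List<String> reduced = jsc.parallelize(Arrays.asList("a", "b", "c")).collect();

    // Hand the same context to the Beam SparkRunner instead of letting it
    // create a second one; this avoids the two-contexts failure.
    SparkContextOptions options = PipelineOptionsFactory.as(SparkContextOptions.class);
    options.setRunner(SparkRunner.class);
    options.setUsesProvidedSparkContext(true);
    options.setProvidedSparkContext(jsc);

    Pipeline pipeline = Pipeline.create(options);
    pipeline.apply(Create.of(reduced));
    // ... rest of the Beam pipeline ...
    pipeline.run().waitUntilFinish();

    jsc.stop();
  }
}

[With setUsesProvidedSparkContext(true), the runner reuses the supplied context, so the Spark
job and the Beam pipeline share one SparkContext in the same JVM.]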