flink-user mailing list archives

From MAHESH KUMAR <r.mahesh.kumar....@gmail.com>
Subject Flink - Writing Test Case for the Datastream
Date Thu, 09 Mar 2017 21:09:34 GMT
Hi Team,

I am trying to write test cases to check whether the job executes as
desired. I am using the Flink test util. I am trying to do end-to-end
testing where Flink reads from a Kafka topic, does some processing, and
then writes the output to another Kafka topic. My objective is to read the
message from the output topic and check whether it matches the expected
message.

I have Zookeeper and Kafka configured for the test. When I start the
Flink job, it never terminates, since its source is a Kafka source. Is
there a way to run a job for a specific interval of time, or how should I
go about testing this scenario? Is there any documentation/example for
running test cases such as these?

My code currently looks something like this:

class StreamingMultipleTest extends StreamingMultipleProgramsTestBase {

  @Before def initialize() = {
    // Start Kafka and Zookeeper
    // Call the run method of the Flink class - FlinkClass.run()
    // (this class contains the env.execute() call)

    // My code does not execute any further, since the previous call never returns
  }

  @Test def test1() = {
    // Check whether the output topic of the Kafka queue contains the expected message
  }

  @After def closeServices() = {
    // Stop Zookeeper and Kafka
  }
}
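A rough sketch of one workaround I have been considering (all names here -
FlinkClass.run, the topic name, the broker address, the timeout - are
assumptions, not tested code): run the blocking job on a daemon thread, then
poll the output topic with a plain KafkaConsumer until a deadline expires,
so the test itself can terminate even though the job does not.

```scala
import java.util.{Collections, Properties}
import org.apache.kafka.clients.consumer.KafkaConsumer

object StreamingTestSketch {
  def main(args: Array[String]): Unit = {
    // 1. Launch the streaming job in the background; env.execute() blocks,
    //    so running it on a daemon thread keeps the test thread free.
    val jobThread = new Thread(new Runnable {
      override def run(): Unit = FlinkClass.run() // assumed entry point
    })
    jobThread.setDaemon(true) // do not keep the JVM alive after the test
    jobThread.start()

    // 2. Consume from the output topic with a deadline.
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092") // assumed test broker
    props.put("group.id", "test-consumer")
    props.put("key.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("value.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("auto.offset.reset", "earliest")

    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(Collections.singletonList("output-topic")) // assumed topic

    val deadline = System.currentTimeMillis() + 30000 // 30s budget for the job
    var received: Option[String] = None
    while (received.isEmpty && System.currentTimeMillis() < deadline) {
      val records = consumer.poll(500)
      val it = records.iterator()
      if (it.hasNext) received = Some(it.next().value())
    }
    consumer.close()

    // 3. Assert on the first message seen (or fail on timeout).
    assert(received.contains("expected message"), s"got $received")
  }
}
```

I am not sure whether this is the idiomatic way to bound a Kafka-sourced
job in a test, or whether there is a supported mechanism (e.g. making the
source finite) that I am missing.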

Thanks and Regards,


Mahesh Kumar Ravindranathan
Data Streaming Engineer
Oracle Marketing Cloud - Social Platform
Contact No:+1(720)492-4445
