Cliff,

Thanks for the response. Well, I do agree that it's simple and seamless. In my case, I am able to upsert ~25,000 events/sec into Kudu. But I am facing a problem when any of the Kudu tablet servers or the master is down: I am not able to get hold of the exception on the client side. The client goes into an infinite loop trying to reconnect to Kudu, and meanwhile I am losing my records. I tried handling the errors through getPendingErrors(), but that did not help either. I am using AsyncKuduClient to establish the connection and retrieving the syncClient from the async client to open the session and table. Any help would be much appreciated.
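
For reference, here is a minimal sketch of how I am setting things up (the master address, table name, column names, and timeout value below are placeholders, not my actual configuration):

    import org.apache.kudu.client.*;

    public class KuduUpsertSketch {
        public static void main(String[] args) throws KuduException {
            // Build the async client; the operation timeout here is just an example value.
            AsyncKuduClient asyncClient = new AsyncKuduClient
                    .AsyncKuduClientBuilder("kudu-master-1:7051")   // placeholder master address
                    .defaultOperationTimeoutMs(30000)
                    .build();

            // Get the synchronous wrapper to open the table and session.
            KuduClient client = asyncClient.syncClient();
            KuduTable table = client.openTable("events");           // placeholder table name
            KuduSession session = client.newSession();
            session.setFlushMode(SessionConfiguration.FlushMode.AUTO_FLUSH_BACKGROUND);

            try {
                Upsert upsert = table.newUpsert();
                PartialRow row = upsert.getRow();
                row.addString("id", "event-1");                      // placeholder columns
                row.addString("payload", "{...}");
                session.apply(upsert);

                // Flush and then check for row errors; this is where I expected
                // failures to show up when a tablet server is down.
                session.flush();
                if (session.countPendingErrors() > 0) {
                    RowErrorsAndOverflowStatus errors = session.getPendingErrors();
                    for (RowError error : errors.getRowErrors()) {
                        System.err.println("Failed row: " + error.getErrorStatus());
                        // TODO: buffer/replay the failed operation instead of dropping it
                    }
                }
            } catch (KuduException e) {
                // Exceptions from apply()/flush() (e.g. timeouts) should end up here.
                e.printStackTrace();
            } finally {
                session.close();
                client.close();
            }
        }
    }

Even with this, when a tablet server goes down I never see the failed rows in getPendingErrors(); the client just keeps retrying and the events in flight are lost.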

Thanks,
Ravi

On 26 February 2018 at 18:00, Cliff Resnick <cresny@gmail.com> wrote:
While I can't speak for Spark, we do use the client API from Flink streaming and it's simple and seamless. It's especially nice if you require upsert semantics.

On Feb 26, 2018 7:51 PM, "Ravi Kanth" <ravikanth.4b0@gmail.com> wrote:
Hi,

Is anyone using Spark Streaming to ingest data into Kudu with the Kudu client API rather than the traditional KuduContext API? I am stuck at one point and haven't been able to find a solution.

Thanks,
Ravi