spark-issues mailing list archives

From "Cody Koeninger (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-12177) Update KafkaDStreams to new Kafka 0.10 Consumer API
Date Wed, 15 Jun 2016 02:42:30 GMT

    [ https://issues.apache.org/jira/browse/SPARK-12177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15331021#comment-15331021 ]

Cody Koeninger commented on SPARK-12177:
----------------------------------------

[~jinxliu@ebay.com]

1. I'm already looking at that test failure, will update once I know what's going on.

2.  I'm really strongly against trying to hide the kafka consumer from users for 0.10. I don't
want to be in the business of anticipating all the ways people will use it, nor the ways it
may change.  The 0.10 consumer isn't particularly difficult to use; the most basic construction
of it is just

val consumer = new KafkaConsumer[String, String](kafkaParams)
consumer.subscribe(topics)

You don't need to know anything about partitionInfo, unless you want/need to.
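For context, here is a minimal standalone sketch of that construct-subscribe-poll pattern against the 0.10 consumer API. The bootstrap servers, group id, and topic name below are placeholder assumptions for illustration, not anything from this ticket.

import java.util.{Arrays, Properties}
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer
import scala.collection.JavaConverters._

object ConsumerSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder connection settings; adjust bootstrap servers and group id for your cluster.
    val kafkaParams = new Properties()
    kafkaParams.put("bootstrap.servers", "localhost:9092")
    kafkaParams.put("group.id", "example-group")
    kafkaParams.put("key.deserializer", classOf[StringDeserializer].getName)
    kafkaParams.put("value.deserializer", classOf[StringDeserializer].getName)

    val topics = Arrays.asList("example-topic")

    // Basic 0.10 usage: construct the consumer, subscribe, then poll for records.
    val consumer = new KafkaConsumer[String, String](kafkaParams)
    consumer.subscribe(topics)
    try {
      val records = consumer.poll(1000L)  // timeout in ms
      for (record <- records.asScala) {
        println(s"${record.topic} ${record.partition} ${record.offset}: ${record.value}")
      }
    } finally {
      consumer.close()
    }
  }
}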

> Update KafkaDStreams to new Kafka 0.10 Consumer API
> ---------------------------------------------------
>
>                 Key: SPARK-12177
>                 URL: https://issues.apache.org/jira/browse/SPARK-12177
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>    Affects Versions: 1.6.0
>            Reporter: Nikita Tarasenko
>              Labels: consumer, kafka
>
> Kafka 0.9 has already been released, and it introduces a new consumer API that is not compatible with
the old one. So, I added the new consumer API. I made separate classes in package org.apache.spark.streaming.kafka.v09
with the changed API. I didn't remove the old classes, for backward compatibility. Users will not
need to change their old Spark applications when they upgrade to the new Spark version.
> Please review my changes



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
