spark-issues mailing list archives

From "Cody Koeninger (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-9780) In case of invalid initialization of KafkaDirectStream, NPE is thrown
Date Tue, 11 Aug 2015 14:43:45 GMT

    [ https://issues.apache.org/jira/browse/SPARK-9780?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14681886#comment-14681886 ]

Cody Koeninger commented on SPARK-9780:
---------------------------------------

Makes sense. I'm traveling currently, but I'll put in a PR.

> In case of invalid initialization of KafkaDirectStream, NPE is thrown
> ---------------------------------------------------------------------
>
>                 Key: SPARK-9780
>                 URL: https://issues.apache.org/jira/browse/SPARK-9780
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.3.1, 1.4.1
>            Reporter: Grigory Turunov
>            Priority: Minor
>
> [o.a.s.streaming.kafka.KafkaRDD.scala#L143|https://github.com/apache/spark/blob/master/external/kafka/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala#L143]
> During initialization of KafkaRDDIterator, a TaskCompletionListener is added to the context
> that calls close() on the consumer, but the consumer is not initialized yet (it is initialized
> 12 lines later).
> If anything goes wrong within those 12 lines (in my case the valueDecoder class only had a
> private constructor), the thrown exception triggers context.markTaskCompleted() in
> [o.a.s.scheduler.Task.scala#L90|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/Task.scala#L90]
> which throws a NullPointerException when it tries to call close() on the uninitialized consumer.
> This masks the original exception, so it is very hard to understand what is actually happening.
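
For illustration, below is a minimal, self-contained sketch of the failure mode described
above. FakeConsumer, FakeTaskContext and createValueDecoder are hypothetical stand-ins for
the Kafka consumer, Spark's TaskContext and the decoder construction in KafkaRDDIterator;
the real APIs differ, and this only reproduces the ordering problem: the completion listener
is registered before the consumer is assigned, so when initialization fails the listener's
close() call throws a NullPointerException that hides the original exception.

import scala.collection.mutable.ArrayBuffer

object NpeMaskingSketch {

  // Stand-in for the Kafka consumer that KafkaRDDIterator closes on task completion.
  final class FakeConsumer {
    def close(): Unit = println("consumer closed")
  }

  // Simplified stand-in for Spark's TaskContext: completion listeners run when the
  // task ends, including the completion that Task.scala triggers after a failure.
  final class FakeTaskContext {
    private val listeners = ArrayBuffer.empty[() => Unit]
    def addTaskCompletionListener(f: () => Unit): Unit = listeners += f
    def markTaskCompleted(): Unit = listeners.foreach(f => f())
  }

  // Models the step that failed in the report: constructing the value decoder.
  def createValueDecoder(): AnyRef =
    throw new InstantiationException("valueDecoder has no public constructor")

  def main(args: Array[String]): Unit = {
    val context = new FakeTaskContext
    var consumer: FakeConsumer = null

    // Listener registered BEFORE the consumer exists, mirroring the order described
    // for KafkaRDD.scala#L143; there is no guard against a null consumer.
    context.addTaskCompletionListener(() => consumer.close())

    try {
      createValueDecoder()           // throws before the consumer is assigned
      consumer = new FakeConsumer    // never reached
    } catch {
      case original: Throwable =>
        println(s"original failure: $original")
        // On failure Task.scala marks the task completed; the listener then throws
        // a NullPointerException on the null consumer, masking `original`.
        context.markTaskCompleted()
    }
  }
}

One possible direction for a fix (not necessarily what the eventual PR does) is to make the
listener defensive, e.g. only call close() when the consumer has actually been assigned, or
to register the listener after the consumer is constructed, so that the original exception
propagates instead of the NullPointerException.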



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


