spark-reviews mailing list archives

From gaborgsomogyi <...@git.apache.org>
Subject [GitHub] spark pull request #20703: [SPARK-19185][SS] Make Kafka consumer cache confi...
Date Thu, 01 Mar 2018 19:21:25 GMT
Github user gaborgsomogyi commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20703#discussion_r171665564
  
    --- Diff: docs/structured-streaming-kafka-integration.md ---
    @@ -376,6 +383,8 @@ The following configurations are optional:
     </tr>
     </table>
     
    +If you would like to disable the caching for Kafka consumers, you can set `spark.streaming.kafka.consumer.cache.enabled`
to `false`. Disabling the cache may be needed to workaround the problem described in SPARK-19185.
This property may be removed in later versions of Spark, once SPARK-19185 is resolved.
    --- End diff --
    
    This is the description in the DStreams docs; I made the two consistent by copying it. Should
we then remove it from the DStreams docs as well? Maybe I'm too optimistic, but I consider this
a workaround, as stated in the descriptions.
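A minimal sketch of applying the workaround the quoted diff describes, assuming a DStreams-based Kafka job (the config key is the one quoted above; the app name and batch interval are illustrative):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Disable the Kafka consumer cache to work around SPARK-19185.
// Per the docs quoted in the diff, this property is a workaround
// and may be removed in a later Spark version.
val conf = new SparkConf()
  .setAppName("KafkaCacheWorkaroundExample") // illustrative name
  .set("spark.streaming.kafka.consumer.cache.enabled", "false")

val ssc = new StreamingContext(conf, Seconds(10)) // illustrative interval
```

The same property can also be passed at submit time with `--conf spark.streaming.kafka.consumer.cache.enabled=false` rather than hard-coding it in the application.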


---


