kafka-commits mailing list archives

From guozh...@apache.org
Subject kafka git commit: KAFKA-2313: javadoc fix for KafkaConsumer deserialization; reviewed by Guozhang Wang
Date Tue, 07 Jul 2015 20:37:10 GMT
Repository: kafka
Updated Branches:
  refs/heads/trunk 826276de1 -> f13dd8024


KAFKA-2313: javadoc fix for KafkaConsumer deserialization; reviewed by Guozhang Wang


Project: http://git-wip-us.apache.org/repos/asf/kafka/repo
Commit: http://git-wip-us.apache.org/repos/asf/kafka/commit/f13dd802
Tree: http://git-wip-us.apache.org/repos/asf/kafka/tree/f13dd802
Diff: http://git-wip-us.apache.org/repos/asf/kafka/diff/f13dd802

Branch: refs/heads/trunk
Commit: f13dd8024d5bc1c11587a3b539556ea01e2c84ca
Parents: 826276d
Author: Onur Karaman <okaraman@linkedin.com>
Authored: Tue Jul 7 13:36:55 2015 -0700
Committer: Guozhang Wang <wangguoz@gmail.com>
Committed: Tue Jul 7 13:36:55 2015 -0700

----------------------------------------------------------------------
 .../apache/kafka/clients/consumer/KafkaConsumer.java    | 12 ++++++------
 1 file changed, 6 insertions(+), 6 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/kafka/blob/f13dd802/clients/src/main/java/org/apache/kafka/clients/consumer/KafkaConsumer.java
----------------------------------------------------------------------
diff --git a/clients/src/main/java/org/apache/kafka/clients/consumer/KafkaConsumer.java b/clients/src/main/java/org/apache/kafka/clients/consumer/KafkaConsumer.java
index 1f0e515..7aa0760 100644
--- a/clients/src/main/java/org/apache/kafka/clients/consumer/KafkaConsumer.java
+++ b/clients/src/main/java/org/apache/kafka/clients/consumer/KafkaConsumer.java
@@ -131,8 +131,8 @@ import static org.apache.kafka.common.utils.Utils.min;
 *     props.put("enable.auto.commit", "true");
 *     props.put("auto.commit.interval.ms", "1000");
 *     props.put("session.timeout.ms", "30000");
- *     props.put("key.serializer", "org.apache.kafka.common.serializers.StringSerializer");
- *     props.put("value.serializer", "org.apache.kafka.common.serializers.StringSerializer");
+ *     props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
+ *     props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
 *     KafkaConsumer<String, String> consumer = new KafkaConsumer<String, String>(props);
 *     consumer.subscribe("foo", "bar");
 *     while (true) {
@@ -159,8 +159,8 @@ import static org.apache.kafka.common.utils.Utils.min;
 * to it. If it stops heartbeating for a period of time longer than <code>session.timeout.ms</code> then it will be
 * considered dead and it's partitions will be assigned to another process.
 * <p>
- * The serializers settings specify how to turn the objects the user provides into bytes. By specifying the string
- * serializers we are saying that our record's key and value will just be simple strings.
+ * The deserializer settings specify how to turn bytes into objects. For example, by specifying string deserializers, we
+ * are saying that our record's key and value will just be simple strings.
  * 
  * <h4>Controlling When Messages Are Considered Consumed</h4>
  * 
@@ -183,8 +183,8 @@ import static org.apache.kafka.common.utils.Utils.min;
 *     props.put("enable.auto.commit", "false");
 *     props.put("auto.commit.interval.ms", "1000");
 *     props.put("session.timeout.ms", "30000");
- *     props.put("key.serializer", "org.apache.kafka.common.serializers.StringSerializer");
- *     props.put("value.serializer", "org.apache.kafka.common.serializers.StringSerializer");
+ *     props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
+ *     props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
 *     KafkaConsumer<String, String> consumer = new KafkaConsumer<String, String>(props);
 *     consumer.subscribe("foo", "bar");
 *     int commitInterval = 200;
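
For reference, the corrected configuration from the patched javadoc can be sketched as a small standalone snippet. This uses only `java.util.Properties` so it runs without the Kafka client jar on the classpath; the config keys and deserializer class names are taken verbatim from the diff above, while the `KafkaConsumer` construction itself is omitted here.

```java
import java.util.Properties;

public class ConsumerConfigExample {
    // Builds the consumer Properties shown in the corrected javadoc example.
    // Note the fix this commit makes: the consumer takes "key.deserializer" /
    // "value.deserializer" (not "key.serializer"), and the classes live in the
    // org.apache.kafka.common.serialization package.
    static Properties consumerProps() {
        Properties props = new Properties();
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("session.timeout.ms", "30000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    public static void main(String[] args) {
        Properties props = consumerProps();
        // In real use these Properties would be passed to
        // new KafkaConsumer<String, String>(props).
        System.out.println(props.getProperty("key.deserializer"));
    }
}
```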

