kafka-commits mailing list archives

From guozh...@apache.org
Subject [kafka] branch trunk updated: MINOR: Remove deprecated KafkaStreams constructors in docs (#5118)
Date Mon, 04 Jun 2018 20:43:47 GMT
This is an automated email from the ASF dual-hosted git repository.

guozhang pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/kafka.git


The following commit(s) were added to refs/heads/trunk by this push:
     new 718d6f2  MINOR: Remove deprecated KafkaStreams constructors in docs (#5118)
718d6f2 is described below

commit 718d6f2475d377fd220da9898d25f7c8191f98cd
Author: Guozhang Wang <wangguoz@gmail.com>
AuthorDate: Mon Jun 4 13:43:20 2018 -0700

    MINOR: Remove deprecated KafkaStreams constructors in docs (#5118)
    
    Reviewers: Bill Bejeck <bill@confluent.io>, Matthias J. Sax <matthias@confluent.io>
---
 docs/streams/developer-guide/config-streams.html   | 20 ++-----------
 docs/streams/developer-guide/datatypes.html        |  6 ++--
 docs/streams/developer-guide/dsl-api.html          | 10 +++----
 .../developer-guide/interactive-queries.html       | 15 +++++-----
 docs/streams/developer-guide/memory-mgmt.html      | 14 ++++-----
 docs/streams/developer-guide/security.html         |  3 +-
 docs/streams/developer-guide/testing.html          | 34 +++++++++++-----------
 docs/streams/developer-guide/write-streams.html    |  7 ++---
 docs/streams/index.html                            | 28 +++++++++---------
 docs/streams/tutorial.html                         |  4 +--
 .../org/apache/kafka/streams/KafkaStreams.java     |  3 +-
 11 files changed, 61 insertions(+), 83 deletions(-)

diff --git a/docs/streams/developer-guide/config-streams.html b/docs/streams/developer-guide/config-streams.html
index 753ddc7..2b6ade5 100644
--- a/docs/streams/developer-guide/config-streams.html
+++ b/docs/streams/developer-guide/config-streams.html
@@ -34,13 +34,11 @@
 
   <div class="section" id="configuring-a-streams-application">
     <span id="streams-developer-guide-configuration"></span><h1>Configuring a Streams Application<a class="headerlink" href="#configuring-a-streams-application" title="Permalink to this headline"></a></h1>
-    <p>Kafka and Kafka Streams configuration options must be configured before using Streams. You can configure Kafka Streams by specifying parameters in a <code class="docutils literal"><span class="pre">StreamsConfig</span></code> instance.</p>
+    <p>Kafka and Kafka Streams configuration options must be configured before using Streams. You can configure Kafka Streams by specifying parameters in a <code class="docutils literal"><span class="pre">java.util.Properties</span></code> instance.</p>
     <ol class="arabic">
       <li><p class="first">Create a <code class="docutils literal"><span class="pre">java.util.Properties</span></code> instance.</p>
       </li>
-      <li><p class="first">Set the <a class="reference internal" href="#streams-developer-guide-required-configs"><span class="std std-ref">parameters</span></a>.</p>
-      </li>
-      <li><p class="first">Construct a <code class="docutils literal"><span class="pre">StreamsConfig</span></code> instance from the <code class="docutils literal"><span class="pre">Properties</span></code> instance. For example:</p>
+      <li><p class="first">Set the <a class="reference internal" href="#streams-developer-guide-required-configs"><span class="std std-ref">parameters</span></a>. For example:</p>
         <div class="highlight-java"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">java.util.Properties</span><span class="o">;</span>
 <span class="kn">import</span> <span class="nn">org.apache.kafka.streams.StreamsConfig</span><span class="o">;</span>
 
@@ -50,9 +48,6 @@
 <span class="n">settings</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">BOOTSTRAP_SERVERS_CONFIG</span><span class="o">,</span> <span class="s">&quot;kafka-broker1:9092&quot;</span><span class="o">);</span>
 <span class="c1">// Any further settings</span>
 <span class="n">settings</span><span class="o">.</span><span class="na">put</span><span class="o">(...</span> <span class="o">,</span> <span class="o">...);</span>
-
-<span class="c1">// Create an instance of StreamsConfig from the Properties instance</span>
-<span class="n">StreamsConfig</span> <span class="n">config</span> <span class="o">=</span> <span class="k">new</span> <span class="n">StreamsConfig</span><span class="o">(</span><span class="n">settings</span><span class="o">);</span>
 </pre></div>
         </div>
       </li>
@@ -520,15 +515,13 @@
       </div>
       <div class="section" id="kafka-consumers-and-producer-configuration-parameters">
         <h3><a class="toc-backref" href="#id16">Kafka consumers and producer configuration parameters</a><a class="headerlink" href="#kafka-consumers-and-producer-configuration-parameters" title="Permalink to this headline"></a></h3>
-        <p>You can specify parameters for the Kafka <a class="reference external" href="../../../javadoc/org/apache/kafka/clients/consumer/package-summary.html">consumers</a> and <a class="reference external" href="../../../javadoc/org/apache/kafka/clients/producer/package-summary.html">producers</a> that are used internally.  The consumer and producer settings
-          are defined by specifying parameters in a <code class="docutils literal"><span class="pre">StreamsConfig</span></code> instance.</p>
+        <p>You can specify parameters for the Kafka <a class="reference external" href="../../../javadoc/org/apache/kafka/clients/consumer/package-summary.html">consumers</a> and <a class="reference external" href="../../../javadoc/org/apache/kafka/clients/producer/package-summary.html">producers</a> that are used internally.
         <p>In this example, the Kafka <a class="reference external" href="../../../javadoc/org/apache/kafka/clients/consumer/ConsumerConfig.html#SESSION_TIMEOUT_MS_CONFIG">consumer session timeout</a> is configured to be 60000 milliseconds in the Streams settings:</p>
         <div class="highlight-java"><div class="highlight"><pre><span></span><span class="n">Properties</span> <span class="n">streamsSettings</span> <span class="o">=</span> <span class="k">new</span> <span class="n">Properties</span><span class="o">();</span>
 <span class="c1">// Example of a &quot;normal&quot; setting for Kafka Streams</span>
 <span class="n">streamsSettings</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">BOOTSTRAP_SERVERS_CONFIG</span><span class="o">,</span> <span class="s">&quot;kafka-broker-01:9092&quot;</span><span class="o">);</span>
 <span class="c1">// Customize the Kafka consumer settings of your Streams application</span>
 <span class="n">streamsSettings</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">ConsumerConfig</span><span class="o">.</span><span class="na">SESSION_TIMEOUT_MS_CONFIG</span><span class="o">,</span> <span class="mi">60000</span><span class="o">);</span>
-<span class="n">StreamsConfig</span> <span class="n">config</span> <span class="o">=</span> <span class="k">new</span> <span class="n">StreamsConfig</span><span class="o">(</span><span class="n">streamsSettings</span><span class="o">);</span>
 </pre></div>
         </div>
         <div class="section" id="naming">
@@ -706,18 +699,11 @@
           <h4><a class="toc-backref" href="#id23">replication.factor</a><a class="headerlink" href="#id2" title="Permalink to this headline"></a></h4>
           <blockquote>
             <div>See the <a class="reference internal" href="#replication-factor-parm"><span class="std std-ref">description here</span></a>.</div></blockquote>
-          <p>You define these settings via <code class="docutils literal"><span class="pre">StreamsConfig</span></code>:</p>
           <div class="highlight-java"><div class="highlight"><pre><span></span><span class="n">Properties</span> <span class="n">streamsSettings</span> <span class="o">=</span> <span class="k">new</span> <span class="n">Properties</span><span class="o">();</span>
 <span class="n">streamsSettings</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">REPLICATION_FACTOR_CONFIG</span><span class="o">,</span> <span class="mi">3</span><span class="o">);</span>
 <span class="n">streamsSettings</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">producerPrefix</span><span class="o">(</span><span class="n">ProducerConfig</span><span class="o">.</span><span class="na">ACKS_CONFIG</span><span class="o">),</span> <span class="s">&quot;all&quot;</span><span class="o">);</span>
 </pre></div>
           </div>
-          <div class="admonition note">
-            <p class="first admonition-title">Note</p>
-            <p class="last">A future version of Kafka Streams will allow developers to set their own app-specific configuration settings through
-              <code class="docutils literal"><span class="pre">StreamsConfig</span></code> as well, which can then be accessed through
-              <a class="reference external" href="../../../javadoc/org/apache/kafka/streams/processor/ProcessorContext.html">ProcessorContext</a>.</p>
-</div>
 </div>
 </div>
 </div>
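The net effect of the config-streams.html change above is that the standalone `StreamsConfig` construction step disappears: the `Properties` instance is handed to `KafkaStreams` directly. A minimal sketch of the resulting pattern, using only the JDK so it compiles without the Kafka jars (the string keys are the literal values behind the `StreamsConfig` constants, e.g. `StreamsConfig.BOOTSTRAP_SERVERS_CONFIG` is `"bootstrap.servers"`; the app id and broker address are placeholders):

```java
import java.util.Properties;

public class ConfigSketch {
    // Build the settings that would now be passed straight to
    // new KafkaStreams(topology, settings) -- no StreamsConfig wrapper needed.
    static Properties buildSettings() {
        Properties settings = new Properties();
        // Literal keys behind StreamsConfig.APPLICATION_ID_CONFIG and
        // StreamsConfig.BOOTSTRAP_SERVERS_CONFIG.
        settings.put("application.id", "my-streams-app");        // placeholder app id
        settings.put("bootstrap.servers", "kafka-broker1:9092"); // placeholder broker
        return settings;
    }

    public static void main(String[] args) {
        System.out.println(buildSettings());
    }
}
```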
diff --git a/docs/streams/developer-guide/datatypes.html b/docs/streams/developer-guide/datatypes.html
index d8d7b4c..1120815 100644
--- a/docs/streams/developer-guide/datatypes.html
+++ b/docs/streams/developer-guide/datatypes.html
@@ -36,7 +36,7 @@
     <p>Every Kafka Streams application must provide SerDes (Serializer/Deserializer) for the data types of record keys and record values (e.g. <code class="docutils literal"><span class="pre">java.lang.String</span></code>) to materialize the data when necessary.  Operations that require such SerDes information include: <code class="docutils literal"><span class="pre">stream()</span></code>, <code class="docutils literal"><span class="pre">table()</span></code>, <code class="docutils lit [...]
     <p>You can provide SerDes by using either of these methods:</p>
     <ul class="simple">
-      <li>By setting default SerDes via a <code class="docutils literal"><span class="pre">StreamsConfig</span></code> instance.</li>
+      <li>By setting default SerDes in the <code class="docutils literal"><span class="pre">java.util.Properties</span></code> config instance.</li>
       <li>By specifying explicit SerDes when calling the appropriate API methods, thus overriding the defaults.</li>
     </ul>
 
@@ -55,7 +55,7 @@
       </ul>
     <div class="section" id="configuring-serdes">
       <h2>Configuring SerDes<a class="headerlink" href="#configuring-serdes" title="Permalink to this headline"></a></h2>
-      <p>SerDes specified in the Streams configuration via <code class="docutils literal"><span class="pre">StreamsConfig</span></code> are used as the default in your Kafka Streams application.</p>
+      <p>SerDes specified in the Streams configuration are used as the default in your Kafka Streams application.</p>
       <div class="highlight-java"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">org.apache.kafka.common.serialization.Serdes</span><span class="o">;</span>
 <span class="kn">import</span> <span class="nn">org.apache.kafka.streams.StreamsConfig</span><span class="o">;</span>
 
@@ -64,8 +64,6 @@
 <span class="n">settings</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">KEY_SERDE_CLASS_CONFIG</span><span class="o">,</span> <span class="n">Serdes</span><span class="o">.</span><span class="na">String</span><span class="o">().</span><span class="na">getClass</span><span class="o">().</span><span class="na">getName</span><span class="o">());</span>
 <span class="c1">// Default serde for values of data records (here: built-in serde for Long type)</span>
 <span class="n">settings</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">VALUE_SERDE_CLASS_CONFIG</span><span class="o">,</span> <span class="n">Serdes</span><span class="o">.</span><span class="na">Long</span><span class="o">().</span><span class="na">getClass</span><span class="o">().</span><span class="na">getName</span><span class="o">());</span>
-
-<span class="n">StreamsConfig</span> <span class="n">config</span> <span class="o">=</span> <span class="k">new</span> <span class="n">StreamsConfig</span><span class="o">(</span><span class="n">settings</span><span class="o">);</span>
 </pre></div>
       </div>
     </div>
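The datatypes.html hunk above likewise drops the trailing `StreamsConfig` construction from the default-serde example. A hedged sketch of setting default serdes purely through `Properties`: the keys shown are the literal values behind the non-deprecated `DEFAULT_KEY_SERDE_CLASS_CONFIG`/`DEFAULT_VALUE_SERDE_CLASS_CONFIG` constants (successors of the `KEY_SERDE_CLASS_CONFIG`/`VALUE_SERDE_CLASS_CONFIG` names visible in the diff), and the serde class names are written out in place of `Serdes.String().getClass().getName()` so the snippet compiles without the Kafka jars:

```java
import java.util.Properties;

public class SerdeConfigSketch {
    static Properties buildSettings() {
        Properties settings = new Properties();
        // Default serde for keys of data records (built-in serde for String type).
        settings.put("default.key.serde",
                "org.apache.kafka.common.serialization.Serdes$StringSerde");
        // Default serde for values of data records (built-in serde for Long type).
        settings.put("default.value.serde",
                "org.apache.kafka.common.serialization.Serdes$LongSerde");
        return settings;
    }

    public static void main(String[] args) {
        System.out.println(buildSettings());
    }
}
```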
diff --git a/docs/streams/developer-guide/dsl-api.html b/docs/streams/developer-guide/dsl-api.html
index 7c0b736..cd3a965 100644
--- a/docs/streams/developer-guide/dsl-api.html
+++ b/docs/streams/developer-guide/dsl-api.html
@@ -3044,14 +3044,14 @@ t=5 (blue), which lead to a merge of sessions and an extension of a session, res
 
 
 <span class="c1">// Write the stream to the output topic, using the configured default key</span>
-<span class="c1">// and value serdes of your `StreamsConfig`.</span>
+<span class="c1">// and value serdes.</span>
 <span class="n">stream</span><span class="o">.</span><span class="na">to</span><span class="o">(</span><span class="s">&quot;my-stream-output-topic&quot;</span><span class="o">);</span>
 
 <span class="c1">// Same for table</span>
 <span class="n">table</span><span class="o">.</span><span class="na">to</span><span class="o">(</span><span class="s">&quot;my-table-output-topic&quot;</span><span class="o">);</span>
 
 <span class="c1">// Write the stream to the output topic, using explicit key and value serdes,</span>
-<span class="c1">// (thus overriding the defaults of your `StreamsConfig`).</span>
+<span class="c1">// (thus overriding the defaults in the config properties).</span>
 <span class="n">stream</span><span class="o">.</span><span class="na">to</span><span class="o">(</span><span class="s">&quot;my-stream-output-topic&quot;</span><span class="o">,</span> <span class="n">Produced</span><span class="o">.</span><span class="na">with</span><span class="o">(</span><span class="n">Serdes</span><span class="o">.</span><span class="na">String</span><span class="o">(),</span> <span class="n">Serdes</span><span class="o">.</span><span class="na">Long</span><span cla [...]
 </pre></div>
                         </div>
@@ -3131,7 +3131,7 @@ t=5 (blue), which lead to a merge of sessions and an extension of a session, res
             </div>
         </div>
         <div class="section" id="testing-a-streams-app">
-            <a class="headerlink" href="#testing-a-streams-app" title="Permalink to this headline"><h2>Testing a Streams application</a></h2>
+            <a class="headerlink" href="#testing-a-streams-app" title="Permalink to this headline"><h2>Testing a Streams application</h2></a>
             Kafka Streams comes with a <code>test-utils</code> module to help you test your application <a href="testing.html">here</a>.
             </div>
         </div>
@@ -3201,7 +3201,7 @@ import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}
 object WordCountApplication extends App {
   import Serdes._
 
-  val config: Properties = {
+  val props: Properties = {
     val p = new Properties()
     p.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-application")
     p.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker1:9092")
@@ -3216,7 +3216,7 @@ object WordCountApplication extends App {
     .count(Materialized.as("counts-store"))
   wordCounts.toStream.to("WordsWithCountsTopic")
 
-  val streams: KafkaStreams = new KafkaStreams(builder.build(), config)
+  val streams: KafkaStreams = new KafkaStreams(builder.build(), props)
   streams.start()
 
   sys.ShutdownHookThread {
diff --git a/docs/streams/developer-guide/interactive-queries.html b/docs/streams/developer-guide/interactive-queries.html
index 9b64ddb..051f87c 100644
--- a/docs/streams/developer-guide/interactive-queries.html
+++ b/docs/streams/developer-guide/interactive-queries.html
@@ -129,7 +129,7 @@
                 <span id="streams-developer-guide-interactive-queries-local-key-value-stores"></span><h3><a class="toc-backref" href="#id4">Querying local key-value stores</a><a class="headerlink" href="#querying-local-key-value-stores" title="Permalink to this headline"></a></h3>
                 <p>To query a local key-value store, you must first create a topology with a key-value store. This example creates a key-value
                     store named &#8220;CountsKeyValueStore&#8221;. This store will hold the latest count for any word that is found on the topic &#8220;word-count-input&#8221;.</p>
-                <div class="highlight-java"><div class="highlight"><pre><span></span><span class="n">StreamsConfig</span> <span class="n">config</span> <span class="o">=</span> <span class="o">...;</span>
+                <div class="highlight-java"><div class="highlight"><pre><span></span><span class="n">Properties </span> <span class="n">props</span> <span class="o">=</span> <span class="o">...;</span>
 <span class="n">StreamsBuilder</span> <span class="n">builder</span> <span class="o">=</span> <span class="o">...;</span>
 <span class="n">KStream</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">String</span><span class="o">&gt;</span> <span class="n">textLines</span> <span class="o">=</span> <span class="o">...;</span>
 
@@ -142,7 +142,7 @@
 <span class="n">groupedByWord</span><span class="o">.</span><span class="na">count</span><span class="o">(</span><span class="n">Materialized</span><span class="o">.&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">String</span><span class="o">,</span> <span class="n">KeyValueStore</span><span class="o">&lt;</span><span class="n">Bytes</span><span class="o">,</span> <span class="kt">byte</span><span class="o">[]&gt;</span><span class="n">as</span><span clas [...]
 
 <span class="c1">// Start an instance of the topology</span>
-<span class="n">KafkaStreams</span> <span class="n">streams</span> <span class="o">=</span> <span class="k">new</span> <span class="n">KafkaStreams</span><span class="o">(</span><span class="n">builder</span><span class="o">,</span> <span class="n">config</span><span class="o">);</span>
+<span class="n">KafkaStreams</span> <span class="n">streams</span> <span class="o">=</span> <span class="k">new</span> <span class="n">KafkaStreams</span><span class="o">(</span><span class="n">builder</span><span class="o">,</span> <span class="n">props</span><span class="o">);</span>
 <span class="n">streams</span><span class="o">.</span><span class="na">start</span><span class="o">();</span>
 </pre></div>
                 </div>
@@ -171,7 +171,7 @@
                 </div>
                 <p>You can also materialize the results of stateless operators by using the overloaded methods that take a <code class="docutils literal"><span class="pre">queryableStoreName</span></code>
                     as shown in the example below:</p>
-                <div class="highlight-java"><div class="highlight"><pre><span></span><span class="n">StreamsConfig</span> <span class="n">config</span> <span class="o">=</span> <span class="o">...;</span>
+                <div class="highlight-java"><div class="highlight"><pre><span></span>
 <span class="n">StreamsBuilder</span> <span class="n">builder</span> <span class="o">=</span> <span class="o">...;</span>
 <span class="n">KTable</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">Integer</span><span class="o">&gt;</span> <span class="n">regionCounts</span> <span class="o">=</span> <span class="o">...;</span>
 
@@ -192,7 +192,7 @@
                     However, there is only one result per window for a given key.</p>
                 <p>To query a local window store, you must first create a topology with a window store. This example creates a window store
                     named &#8220;CountsWindowStore&#8221; that contains the counts for words in 1-minute windows.</p>
-                <div class="highlight-java"><div class="highlight"><pre><span></span><span class="n">StreamsConfig</span> <span class="n">config</span> <span class="o">=</span> <span class="o">...;</span>
+                <div class="highlight-java"><div class="highlight"><pre><span></span>
 <span class="n">StreamsBuilder</span> <span class="n">builder</span> <span class="o">=</span> <span class="o">...;</span>
 <span class="n">KStream</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">String</span><span class="o">&gt;</span> <span class="n">textLines</span> <span class="o">=</span> <span class="o">...;</span>
 
@@ -316,7 +316,7 @@
 </pre></div>
                 </div>
                 <p>You can now find and query your custom store:</p>
-                <div class="highlight-java"><div class="highlight"><pre><span></span><span class="n">StreamsConfig</span> <span class="n">config</span> <span class="o">=</span> <span class="o">...;</span>
+                <div class="highlight-java"><div class="highlight"><pre><span></span>
 <span class="n">Topology</span> <span class="n">topology</span> <span class="o">=</span> <span class="o">...;</span>
 <span class="n">ProcessorSupplier</span> <span class="n">processorSuppler</span> <span class="o">=</span> <span class="o">...;</span>
 
@@ -368,7 +368,7 @@ interactive queries</span></p>
             </div>
             <div class="section" id="exposing-the-rpc-endpoints-of-your-application">
                 <span id="streams-developer-guide-interactive-queries-expose-rpc"></span><h3><a class="toc-backref" href="#id9">Exposing the RPC endpoints of your application</a><a class="headerlink" href="#exposing-the-rpc-endpoints-of-your-application" title="Permalink to this headline"></a></h3>
-                <p>To enable remote state store discovery in a distributed Kafka Streams application, you must set the <a class="reference internal" href="config-streams.html#streams-developer-guide-required-configs"><span class="std std-ref">configuration property</span></a> in <code class="docutils literal"><span class="pre">StreamsConfig</span></code>.
+                <p>To enable remote state store discovery in a distributed Kafka Streams application, you must set the <a class="reference internal" href="config-streams.html#streams-developer-guide-required-configs"><span class="std std-ref">configuration property</span></a> in the config properties.
                     The <code class="docutils literal"><span class="pre">application.server</span></code> property defines a unique <code class="docutils literal"><span class="pre">host:port</span></code> pair that points to the RPC endpoint of the respective instance of a Kafka Streams application.
                     The value of this configuration property will vary across the instances of your application.
                     When this property is set, Kafka Streams will keep track of the RPC endpoint information for every instance of an application, its state stores, and assigned stream partitions through instances of <a class="reference external" href="../../../javadoc/org/apache/kafka/streams/state/StreamsMetadata.html">StreamsMetadata</a>.</p>
@@ -386,7 +386,6 @@ interactive queries</span></p>
 <span class="n">props</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">APPLICATION_SERVER_CONFIG</span><span class="o">,</span> <span class="n">rpcEndpoint</span><span class="o">);</span>
 <span class="c1">// ... further settings may follow here ...</span>
 
-<span class="n">StreamsConfig</span> <span class="n">config</span> <span class="o">=</span> <span class="k">new</span> <span class="n">StreamsConfig</span><span class="o">(</span><span class="n">props</span><span class="o">);</span>
 <span class="n">StreamsBuilder</span> <span class="n">builder</span> <span class="o">=</span> <span class="k">new</span> <span class="n">StreamsBuilder</span><span class="o">();</span>
 
 <span class="n">KStream</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">String</span><span class="o">&gt;</span> <span class="n">textLines</span> <span class="o">=</span> <span class="n">builder</span><span class="o">.</span><span class="na">stream</span><span class="o">(</span><span class="n">stringSerde</span><span class="o">,</span> <span class="n">stringSerde</span><span class="o">,</span> <span class="s">&quot;word-count-input&q [...]
@@ -400,7 +399,7 @@ interactive queries</span></p>
 <span class="n">groupedByWord</span><span class="o">.</span><span class="na">count</span><span class="o">(</span><span class="n">Materialized</span><span class="o">.&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">Long</span><span class="o">,</span> <span class="n">KeyValueStore</span><span class="o">&lt;</span><span class="n">Bytes</span><span class="o">,</span> <span class="kt">byte</span><span class="o">[]&gt;</span><span class="n">as</span><span class= [...]
 
 <span class="c1">// Start an instance of the topology</span>
-<span class="n">KafkaStreams</span> <span class="n">streams</span> <span class="o">=</span> <span class="k">new</span> <span class="n">KafkaStreams</span><span class="o">(</span><span class="n">builder</span><span class="o">,</span> <span class="n">streamsConfiguration</span><span class="o">);</span>
+<span class="n">KafkaStreams</span> <span class="n">streams</span> <span class="o">=</span> <span class="k">new</span> <span class="n">KafkaStreams</span><span class="o">(</span><span class="n">builder</span><span class="o">,</span> <span class="n">props</span><span class="o">);</span>
 <span class="n">streams</span><span class="o">.</span><span class="na">start</span><span class="o">();</span>
 
 <span class="c1">// Then, create and start the actual RPC service for remote access to this</span>
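The interactive-queries.html hunks remove the intermediate `StreamsConfig` from the RPC-endpoint example as well. A stdlib-only sketch of composing the `application.server` value (the literal key behind `StreamsConfig.APPLICATION_SERVER_CONFIG`); the host and port are placeholders, and the value must be unique per application instance:

```java
import java.util.Properties;

public class RpcEndpointSketch {
    // Compose the unique host:port pair that identifies this instance's
    // RPC endpoint for remote state store discovery.
    static Properties buildSettings(String host, int port) {
        Properties props = new Properties();
        props.put("application.id", "interactive-queries-app"); // placeholder
        props.put("bootstrap.servers", "kafka-broker1:9092");   // placeholder
        // Literal key behind StreamsConfig.APPLICATION_SERVER_CONFIG; this
        // value varies across the instances of your application.
        props.put("application.server", host + ":" + port);
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildSettings("app-host-1", 4460).getProperty("application.server"));
    }
}
```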
diff --git a/docs/streams/developer-guide/memory-mgmt.html b/docs/streams/developer-guide/memory-mgmt.html
index a73a814..7ae2060 100644
--- a/docs/streams/developer-guide/memory-mgmt.html
+++ b/docs/streams/developer-guide/memory-mgmt.html
@@ -80,8 +80,8 @@
       <p>The cache size is specified through the <code class="docutils literal"><span class="pre">cache.max.bytes.buffering</span></code> parameter, which is a global setting per
         processing topology:</p>
       <div class="highlight-java"><div class="highlight"><pre><span></span><span class="c1">// Enable record cache of size 10 MB.</span>
-<span class="n">Properties</span> <span class="n">streamsConfiguration</span> <span class="o">=</span> <span class="k">new</span> <span class="n">Properties</span><span class="o">();</span>
-<span class="n">streamsConfiguration</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">CACHE_MAX_BYTES_BUFFERING_CONFIG</span><span class="o">,</span> <span class="mi">10</span> <span class="o">*</span> <span class="mi">1024</span> <span class="o">*</span> <span class="mi">1024L</span><span class="o">);</span>
+<span class="n">Properties</span> <span class="n">props</span> <span class="o">=</span> <span class="k">new</span> <span class="n">Properties</span><span class="o">();</span>
+<span class="n">props</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">CACHE_MAX_BYTES_BUFFERING_CONFIG</span><span class="o">,</span> <span class="mi">10</span> <span class="o">*</span> <span class="mi">1024</span> <span class="o">*</span> <span class="mi">1024L</span><span class="o">);</span>
 </pre></div>
       </div>
       <p>This parameter controls the number of bytes allocated for caching. Specifically, for a processor topology instance with
@@ -105,8 +105,8 @@
         <li><p class="first">To turn off caching the cache size can be set to zero:</p>
           <blockquote>
             <div><div class="highlight-java"><div class="highlight"><pre><span></span><span class="c1">// Disable record cache</span>
-<span class="n">Properties</span> <span class="n">streamsConfiguration</span> <span class="o">=</span> <span class="k">new</span> <span class="n">Properties</span><span class="o">();</span>
-<span class="n">streamsConfiguration</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">CACHE_MAX_BYTES_BUFFERING_CONFIG</span><span class="o">,</span> <span class="mi">0</span><span class="o">);</span>
+<span class="n">Properties</span> <span class="n">props</span> <span class="o">=</span> <span class="k">new</span> <span class="n">Properties</span><span class="o">();</span>
+<span class="n">props</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">CACHE_MAX_BYTES_BUFFERING_CONFIG</span><span class="o">,</span> <span class="mi">0</span><span class="o">);</span>
 </pre></div>
             </div>
               <p>Turning off caching might result in high write traffic for the underlying RocksDB store.
@@ -118,11 +118,11 @@
         </li>
         <li><p class="first">To enable caching but still have an upper bound on how long records will be cached, you can set the commit interval. In this example, it is set to 1000 milliseconds:</p>
           <blockquote>
-            <div><div class="highlight-java"><div class="highlight"><pre><span></span><span class="n">Properties</span> <span class="n">streamsConfiguration</span> <span class="o">=</span> <span class="k">new</span> <span class="n">Properties</span><span class="o">();</span>
+            <div><div class="highlight-java"><div class="highlight"><pre><span></span><span class="n">Properties</span> <span class="n">props</span> <span class="o">=</span> <span class="k">new</span> <span class="n">Properties</span><span class="o">();</span>
 <span class="c1">// Enable record cache of size 10 MB.</span>
-<span class="n">streamsConfiguration</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">CACHE_MAX_BYTES_BUFFERING_CONFIG</span><span class="o">,</span> <span class="mi">10</span> <span class="o">*</span> <span class="mi">1024</span> <span class="o">*</span> <span class="mi">1024L</span><span class="o">);</span>
+<span class="n">props</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">CACHE_MAX_BYTES_BUFFERING_CONFIG</span><span class="o">,</span> <span class="mi">10</span> <span class="o">*</span> <span class="mi">1024</span> <span class="o">*</span> <span class="mi">1024L</span><span class="o">);</span>
 <span class="c1">// Set commit interval to 1 second.</span>
-<span class="n">streamsConfiguration</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">COMMIT_INTERVAL_MS_CONFIG</span><span class="o">,</span> <span class="mi">1000</span><span class="o">);</span>
+<span class="n">props</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">StreamsConfig</span><span class="o">.</span><span class="na">COMMIT_INTERVAL_MS_CONFIG</span><span class="o">,</span> <span class="mi">1000</span><span class="o">);</span>
 </pre></div>
             </div>
             </div></blockquote>
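For reference, the post-change pattern in the hunk above can be sketched stdlib-only by using the literal config key strings (`cache.max.bytes.buffering`, `commit.interval.ms`) that are assumed here to back the `StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG` and `StreamsConfig.COMMIT_INTERVAL_MS_CONFIG` constants shown in the diff; the `Properties` instance would then be handed directly to the `KafkaStreams` constructor:

```java
import java.util.Properties;

public class CacheConfigSketch {
    public static Properties cacheProps() {
        Properties props = new Properties();
        // Enable a 10 MB record cache; literal key assumed to match
        // StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG.
        props.setProperty("cache.max.bytes.buffering", Long.toString(10 * 1024 * 1024L));
        // Commit every second; literal key assumed to match
        // StreamsConfig.COMMIT_INTERVAL_MS_CONFIG.
        props.setProperty("commit.interval.ms", "1000");
        return props;
    }

    public static void main(String[] args) {
        Properties props = cacheProps();
        System.out.println(props.getProperty("cache.max.bytes.buffering")); // 10485760
        System.out.println(props.getProperty("commit.interval.ms"));        // 1000
    }
}
```

Note that `setProperty` (string values) is used rather than `put` with a boxed `Long`, so the values survive `getProperty` lookups and serialization to a properties file.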
diff --git a/docs/streams/developer-guide/security.html b/docs/streams/developer-guide/security.html
index 0604747..9c49456 100644
--- a/docs/streams/developer-guide/security.html
+++ b/docs/streams/developer-guide/security.html
@@ -95,7 +95,7 @@ ssl.keystore.password<span class="o">=</span>test1234
 ssl.key.password<span class="o">=</span>test1234
 </pre></div>
             </div>
-            <p>Configure these settings in the application for your <code class="docutils literal"><span class="pre">StreamsConfig</span></code> instance. These settings will encrypt any
+            <p>Configure these settings in the application for your <code class="docutils literal"><span class="pre">Properties</span></code> instance. These settings will encrypt any
                 data-in-transit that is being read from or written to Kafka, and your application will authenticate itself against the
                 Kafka brokers that it is communicating with. Note that this example does not cover client authorization.</p>
             <div class="highlight-java"><div class="highlight"><pre><span></span><span class="c1">// Code of your Java application that uses the Kafka Streams library</span>
@@ -115,7 +115,6 @@ ssl.key.password<span class="o">=</span>test1234
 <span class="n">settings</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">SslConfigs</span><span class="o">.</span><span class="na">SSL_KEYSTORE_LOCATION_CONFIG</span><span class="o">,</span> <span class="s">&quot;/etc/security/tls/kafka.client.keystore.jks&quot;</span><span class="o">);</span>
 <span class="n">settings</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">SslConfigs</span><span class="o">.</span><span class="na">SSL_KEYSTORE_PASSWORD_CONFIG</span><span class="o">,</span> <span class="s">&quot;test1234&quot;</span><span class="o">);</span>
 <span class="n">settings</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="n">SslConfigs</span><span class="o">.</span><span class="na">SSL_KEY_PASSWORD_CONFIG</span><span class="o">,</span> <span class="s">&quot;test1234&quot;</span><span class="o">);</span>
-<span class="n">StreamsConfig</span> <span class="n">streamsConfiguration</span> <span class="o">=</span> <span class="k">new</span> <span class="n">StreamsConfig</span><span class="o">(</span><span class="n">settings</span><span class="o">);</span>
 </pre></div>
             </div>
             <p>If you incorrectly configure a security setting in your application, it will fail at runtime, typically right after you
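The security hunk above drops the `StreamsConfig` wrapper, so the SSL settings live in a bare `Properties` instance. A minimal stdlib-only sketch of that instance follows; the keystore/truststore paths and passwords are the placeholder values from the doc example, and the literal key strings are assumed to match the client SSL config names (`security.protocol`, `ssl.truststore.location`, etc.) used by Kafka clients:

```java
import java.util.Properties;

public class SslSettingsSketch {
    public static Properties sslSettings() {
        Properties settings = new Properties();
        settings.setProperty("security.protocol", "SSL");
        // Placeholder paths/passwords copied from the doc example above.
        settings.setProperty("ssl.truststore.location", "/etc/security/tls/kafka.client.truststore.jks");
        settings.setProperty("ssl.truststore.password", "test1234");
        settings.setProperty("ssl.keystore.location", "/etc/security/tls/kafka.client.keystore.jks");
        settings.setProperty("ssl.keystore.password", "test1234");
        settings.setProperty("ssl.key.password", "test1234");
        // After this commit, `settings` is passed directly to
        // new KafkaStreams(topology, settings) -- no StreamsConfig wrapper.
        return settings;
    }

    public static void main(String[] args) {
        System.out.println(sslSettings().getProperty("security.protocol")); // SSL
    }
}
```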
diff --git a/docs/streams/developer-guide/testing.html b/docs/streams/developer-guide/testing.html
index ea2ae98..92d8fce 100644
--- a/docs/streams/developer-guide/testing.html
+++ b/docs/streams/developer-guide/testing.html
@@ -86,10 +86,10 @@ builder.stream("input-topic").filter(...).to("output-topic");
 Topology topology = builder.build();
 
 // setup test driver
-Properties config = new Properties();
-config.put(StreamsConfig.APPLICATION_ID_CONFIG, "test");
-config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");
-TopologyTestDriver testDriver = new TopologyTestDriver(topology, config);
+Properties props = new Properties();
+props.put(StreamsConfig.APPLICATION_ID_CONFIG, "test");
+props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");
+TopologyTestDriver testDriver = new TopologyTestDriver(topology, props);
         </pre>
             <p>
                 The test driver accepts <code>ConsumerRecord</code>s with key and value type <code>byte[]</code>.
@@ -171,12 +171,12 @@ public void setup() {
     topology.addSink("sinkProcessor", "result-topic", "aggregator");
 
     // setup test driver
-    Properties config = new Properties();
-    config.setProperty(StreamsConfig.APPLICATION_ID_CONFIG, "maxAggregation");
-    config.setProperty(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");
-    config.setProperty(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
-    config.setProperty(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.Long().getClass().getName());
-    testDriver = new TopologyTestDriver(topology, config);
+    Properties props = new Properties();
+    props.setProperty(StreamsConfig.APPLICATION_ID_CONFIG, "maxAggregation");
+    props.setProperty(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");
+    props.setProperty(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
+    props.setProperty(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.Long().getClass().getName());
+    testDriver = new TopologyTestDriver(topology, props);
 
     // pre-populate store
     store = testDriver.getKeyValueStore("aggStore");
@@ -318,13 +318,13 @@ processorUnderTest.init(context);
             If you need to pass configuration to your processor or set the default serdes, you can create the mock with
             config:
             <pre>
-final Properties config = new Properties();
-config.put(StreamsConfig.APPLICATION_ID_CONFIG, "unit-test");
-config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "");
-config.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
-config.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.Long().getClass());
-config.put("some.other.config", "some config value");
-final MockProcessorContext context = new MockProcessorContext(config);
+final Properties props = new Properties();
+props.put(StreamsConfig.APPLICATION_ID_CONFIG, "unit-test");
+props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "");
+props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
+props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.Long().getClass());
+props.put("some.other.config", "some config value");
+final MockProcessorContext context = new MockProcessorContext(props);
                 </pre>
             </p>
             <b>Captured data</b>
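The testing hunks above show the same migration for `TopologyTestDriver` and `MockProcessorContext`: both now take a plain `Properties`. The configuration from the `maxAggregation` example can be sketched stdlib-only as below; the literal keys (`application.id`, `bootstrap.servers`, `default.key.serde`, `default.value.serde`) and the serde class names are assumptions matching what `StreamsConfig` constants and `Serdes.String().getClass().getName()` would produce:

```java
import java.util.Properties;

public class TestDriverPropsSketch {
    public static Properties testDriverProps() {
        Properties props = new Properties();
        // Literal keys assumed to back StreamsConfig.APPLICATION_ID_CONFIG,
        // BOOTSTRAP_SERVERS_CONFIG, and the default serde configs.
        props.setProperty("application.id", "maxAggregation");
        // The test driver never connects, so a dummy bootstrap address suffices.
        props.setProperty("bootstrap.servers", "dummy:1234");
        // Assumed fully-qualified names of the built-in serde classes.
        props.setProperty("default.key.serde", "org.apache.kafka.common.serialization.Serdes$StringSerde");
        props.setProperty("default.value.serde", "org.apache.kafka.common.serialization.Serdes$LongSerde");
        return props;
    }

    public static void main(String[] args) {
        // With kafka-streams-test-utils on the classpath this would become:
        //   TopologyTestDriver testDriver = new TopologyTestDriver(topology, testDriverProps());
        System.out.println(testDriverProps().getProperty("application.id")); // maxAggregation
    }
}
```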
diff --git a/docs/streams/developer-guide/write-streams.html b/docs/streams/developer-guide/write-streams.html
index 0007b3e..145eb30 100644
--- a/docs/streams/developer-guide/write-streams.html
+++ b/docs/streams/developer-guide/write-streams.html
@@ -119,11 +119,10 @@
         <li>The first argument of the <code class="docutils literal"><span class="pre">KafkaStreams</span></code> constructor takes a topology (either <code class="docutils literal"><span class="pre">StreamsBuilder#build()</span></code> for the
           <a class="reference internal" href="dsl-api.html#streams-developer-guide-dsl"><span class="std std-ref">DSL</span></a> or <code class="docutils literal"><span class="pre">Topology</span></code> for the
           <a class="reference internal" href="processor-api.html#streams-developer-guide-processor-api"><span class="std std-ref">Processor API</span></a>) that is used to define a topology.</li>
-        <li>The second argument is an instance of <code class="docutils literal"><span class="pre">StreamsConfig</span></code>, which defines the configuration for this specific topology.</li>
+        <li>The second argument is an instance of <code class="docutils literal"><span class="pre">java.util.Properties</span></code>, which defines the configuration for this specific topology.</li>
       </ul>
       <p>Code example:</p>
       <div class="highlight-java"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">org.apache.kafka.streams.KafkaStreams</span><span class="o">;</span>
-<span class="kn">import</span> <span class="nn">org.apache.kafka.streams.StreamsConfig</span><span class="o">;</span>
 <span class="kn">import</span> <span class="nn">org.apache.kafka.streams.kstream.StreamsBuilder</span><span class="o">;</span>
 <span class="kn">import</span> <span class="nn">org.apache.kafka.streams.processor.Topology</span><span class="o">;</span>
 
@@ -142,9 +141,9 @@
 <span class="c1">// Use the configuration to tell your application where the Kafka cluster is,</span>
 <span class="c1">// which Serializers/Deserializers to use by default, to specify security settings,</span>
 <span class="c1">// and so on.</span>
-<span class="n">StreamsConfig</span> <span class="n">config</span> <span class="o">=</span> <span class="o">...;</span>
+<span class="n">Properties</span> <span class="n">props</span> <span class="o">=</span> <span class="o">...;</span>
 
-<span class="n">KafkaStreams</span> <span class="n">streams</span> <span class="o">=</span> <span class="k">new</span> <span class="n">KafkaStreams</span><span class="o">(</span><span class="n">topology</span><span class="o">,</span> <span class="n">config</span><span class="o">);</span>
+<span class="n">KafkaStreams</span> <span class="n">streams</span> <span class="o">=</span> <span class="k">new</span> <span class="n">KafkaStreams</span><span class="o">(</span><span class="n">topology</span><span class="o">,</span> <span class="n">props</span><span class="o">);</span>
 </pre></div>
       </div>
       <p>At this point, internal structures are initialized, but the processing is not started yet.
diff --git a/docs/streams/index.html b/docs/streams/index.html
index 6dfaf6b..193a7b2 100644
--- a/docs/streams/index.html
+++ b/docs/streams/index.html
@@ -172,11 +172,11 @@
                    public class WordCountApplication {
 
                        public static void main(final String[] args) throws Exception {
-                           Properties config = new Properties();
-                           config.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-application");
-                           config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker1:9092");
-                           config.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
-                           config.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
+                           Properties props = new Properties();
+                           props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-application");
+                           props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker1:9092");
+                           props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
+                           props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
 
                            StreamsBuilder builder = new StreamsBuilder();
                            KStream&lt;String, String&gt; textLines = builder.stream("TextLinesTopic");
@@ -186,7 +186,7 @@
                                .count(Materialized.&lt;String, Long, KeyValueStore&lt;Bytes, byte[]&gt;&gt;as("counts-store"));
                            wordCounts.toStream().to("WordsWithCountsTopic", Produced.with(Serdes.String(), Serdes.Long()));
 
-                           KafkaStreams streams = new KafkaStreams(builder.build(), config);
+                           KafkaStreams streams = new KafkaStreams(builder.build(), props);
                            streams.start();
                        }
 
@@ -215,11 +215,11 @@
                    public class WordCountApplication {
 
                        public static void main(final String[] args) throws Exception {
-                           Properties config = new Properties();
-                           config.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-application");
-                           config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker1:9092");
-                           config.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
-                           config.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
+                           Properties props = new Properties();
+                           props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-application");
+                           props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker1:9092");
+                           props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
+                           props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
 
                            StreamsBuilder builder = new StreamsBuilder();
                            KStream&lt;String, String&gt; textLines = builder.stream("TextLinesTopic");
@@ -241,7 +241,7 @@
 
                            wordCounts.toStream().to("WordsWithCountsTopic", Produced.with(Serdes.String(), Serdes.Long()));
 
-                           KafkaStreams streams = new KafkaStreams(builder.build(), config);
+                           KafkaStreams streams = new KafkaStreams(builder.build(), props);
                            streams.start();
                        }
 
@@ -263,7 +263,7 @@ import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}
 object WordCountApplication extends App {
   import Serdes._
 
-  val config: Properties = {
+  val props: Properties = {
     val p = new Properties()
     p.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-application")
     p.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker1:9092")
@@ -278,7 +278,7 @@ object WordCountApplication extends App {
     .count(Materialized.as("counts-store"))
   wordCounts.toStream.to("WordsWithCountsTopic")
 
-  val streams: KafkaStreams = new KafkaStreams(builder.build(), config)
+  val streams: KafkaStreams = new KafkaStreams(builder.build(), props)
   streams.start()
 
   sys.ShutdownHookThread {
diff --git a/docs/streams/tutorial.html b/docs/streams/tutorial.html
index 21ff030..0006e3e 100644
--- a/docs/streams/tutorial.html
+++ b/docs/streams/tutorial.html
@@ -208,9 +208,7 @@
     <p>
         Note that we can always describe the topology as we did above at any given point while we are building it in the code, so as a user you can interactively "try and taste" your computational logic defined in the topology until you are happy with it.
         Suppose we are already done with this simple topology that just pipes data from one Kafka topic to another in an endless streaming manner,
-        we can now construct the Streams client with the two components we have just constructed above: the configuration map and the topology object
-        (one can also construct a <code>StreamsConfig</code> object from the <code>props</code> map and then pass that object to the constructor,
-        <code>KafkaStreams</code> have overloaded constructor functions to takes either type).
+        we can now construct the Streams client with the two components we have just constructed above: the configuration map specified in a <code>java.util.Properties</code> instance and the <code>Topology</code> object.
     </p>
 
     <pre class="brush: java;">
diff --git a/streams/src/main/java/org/apache/kafka/streams/KafkaStreams.java b/streams/src/main/java/org/apache/kafka/streams/KafkaStreams.java
index e109345..d6002ff 100644
--- a/streams/src/main/java/org/apache/kafka/streams/KafkaStreams.java
+++ b/streams/src/main/java/org/apache/kafka/streams/KafkaStreams.java
@@ -109,12 +109,11 @@ import static org.apache.kafka.common.utils.Utils.getPort;
  * props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
  * props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
  * props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
- * StreamsConfig config = new StreamsConfig(props);
  *
  * StreamsBuilder builder = new StreamsBuilder();
  * builder.<String, String>stream("my-input-topic").mapValues(value -> value.length().toString()).to("my-output-topic");
  *
- * KafkaStreams streams = new KafkaStreams(builder.build(), config);
+ * KafkaStreams streams = new KafkaStreams(builder.build(), props);
  * streams.start();
  * }</pre>
  *

-- 
To stop receiving notification emails like this one, please contact
guozhang@apache.org.
