kafka-commits mailing list archives

From: guozh...@apache.org
Subject: [kafka] branch trunk updated: MINOR: add headers support in new api (#5252)
Date: Tue, 19 Jun 2018 21:23:14 GMT
This is an automated email from the ASF dual-hosted git repository.

guozhang pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/kafka.git


The following commit(s) were added to refs/heads/trunk by this push:
     new f8cb3c3  MINOR: add headers support in new api (#5252)
f8cb3c3 is described below

commit f8cb3c3e1d16d34e5cc382cf4833330f9cce9aad
Author: Guozhang Wang <wangguoz@gmail.com>
AuthorDate: Tue Jun 19 14:23:04 2018 -0700

    MINOR: add headers support in new api (#5252)
    
    Reviewers: Matthias J. Sax <matthias@confluent.io>
---
 docs/streams/upgrade-guide.html | 11 +++++++++--
 1 file changed, 9 insertions(+), 2 deletions(-)

diff --git a/docs/streams/upgrade-guide.html b/docs/streams/upgrade-guide.html
index cd92782..a74eeb8 100644
--- a/docs/streams/upgrade-guide.html
+++ b/docs/streams/upgrade-guide.html
@@ -135,6 +135,12 @@
         For more details, see <a href="https://cwiki.apache.org/confluence/display/KAFKA/KIP-265%3A+Make+Windowed+Serde+to+public+APIs">KIP-265</a>.
     </p>
     <p>
+        We've added message header support in the <code>Processor API</code> in Kafka 2.0.0. In particular, we have added a new API <code>ProcessorContext#headers()</code>
+        which returns a <code>Headers</code> object that keeps track of the headers of the source topic's message that is being processed. Through this object, users can manipulate
+        the headers map that is being propagated throughout the processor topology as well. For more details please feel free to read
+        the <a href="/{{version}}/documentation/streams/developer-guide/processor-api.html#accessing-processor-context">Developer Guide</a> section.
+    </p>
+    <p>
         We have deprecated constructors of <code>KafkaStreams</code> that take a <code>StreamsConfig</code> as parameter.
         Please use the other corresponding constructors that accept <code>java.util.Properties</code> instead.
         For more details, see <a href="https://cwiki.apache.org/confluence/display/KAFKA/KIP-245%3A+Use+Properties+instead+of+StreamsConfig+in+KafkaStreams+constructor">KIP-245</a>.
@@ -163,7 +169,8 @@
     </p>
     <p>
         Kafka Streams DSL for Scala is a new Kafka Streams client library available for developers authoring Kafka Streams applications in Scala.  It wraps core Kafka Streams DSL types to make it easier to call when
-        interoperating with Scala code.  For example, it includes higher order functions as parameters for transformations avoiding the need anonymous classes in Java 7 or experimental SAM type conversions in Scala 2.11, automatic conversion between Java and Scala collection types, a way
+        interoperating with Scala code.  For example, it includes higher order functions as parameters for transformations avoiding the need anonymous classes in Java 7 or experimental SAM type conversions in Scala 2.11,
+        automatic conversion between Java and Scala collection types, a way
         to implicitly provide SerDes to reduce boilerplate from your application and make it more typesafe, and more!  For more information see the
         <a href="/{{version}}/documentation/streams/developer-guide/dsl-api.html#scala-dsl">Kafka Streams DSL for Scala documentation</a> and
         <a href="https://cwiki.apache.org/confluence/display/KAFKA/KIP-270+-+A+Scala+Wrapper+Library+for+Kafka+Streams">KIP-270</a>.
@@ -181,7 +188,7 @@
             For detailed guidance on how to update your code please read <a href="#streams_api_changes_100">here</a></li>
         <li><code>KStream, KTable, KGroupedStream</code> overloaded functions that requires serde and other specifications explicitly are removed and replaced with simpler overloaded functions that use <code>Consumed, Produced, Serialized, Materialized, Joined</code> (they are deprecated since 1.0.0).
             For detailed guidance on how to update your code please read <a href="#streams_api_changes_100">here</a></li>
-        <li><code>Processor#punctuate</code>, <code>ValueTransformer#punctuate</code>, <code>ValueTransformer#punctuate</code> and <code>RecordContext#schedule(long)</code> are removed and replaced by <code>RecordContext#schedule(long, PunctuationType, Punctuator)</code> (they are deprecated in 1.0.0). </li>
+        <li><code>Processor#punctuate</code>, <code>ValueTransformer#punctuate</code>, <code>ValueTransformer#punctuate</code> and <code>ProcessorContext#schedule(long)</code> are removed and replaced by <code>ProcessorContext#schedule(long, PunctuationType, Punctuator)</code> (they are deprecated in 1.0.0). </li>
         <li>The second <code>boolean</code> typed parameter "loggingEnabled" in <code>ProcessorContext#register</code> has been removed; users can now use <code>StoreBuilder#withLoggingEnabled, withLoggingDisabled</code> to specify the behavior when they create the state store. </li>
         <li><code>KTable#writeAs, print, foreach, to, through</code> are removed, users can call <code>KTable#tostream()#writeAs</code> instead for the same purpose (they are deprecated since 0.11.0.0).
             For detailed list of removed APIs please read <a href="#streams_api_changes_0110">here</a></li>
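
Editor's note: as a rough illustration of the ProcessorContext#headers() API described in the first hunk above, the sketch below shows a Processor that reads one header and adds another. The class name and the header keys ("trace-id", "handled-by") are illustrative only and are not part of this patch.

    import java.nio.charset.StandardCharsets;

    import org.apache.kafka.common.header.Header;
    import org.apache.kafka.streams.processor.AbstractProcessor;

    // Minimal sketch: inspect an incoming record header and add a new one that
    // is propagated downstream through the topology (names are hypothetical).
    public class HeaderAwareProcessor extends AbstractProcessor<String, String> {

        @Override
        public void process(final String key, final String value) {
            final Header traceId = context().headers().lastHeader("trace-id");
            if (traceId != null) {
                // header values are raw byte arrays
                System.out.println("trace-id: " + new String(traceId.value(), StandardCharsets.UTF_8));
            }
            // headers added here travel with the record to downstream processors and sink topics
            context().headers().add("handled-by", "header-aware-processor".getBytes(StandardCharsets.UTF_8));
            context().forward(key, value);
        }
    }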

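Editor's note: the KIP-245 change in the same hunk deprecates the KafkaStreams constructors that take a StreamsConfig. A minimal before/after sketch, assuming placeholder topic names and config values:

    import java.util.Properties;

    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.Topology;

    public class PropertiesConstructorExample {
        public static void main(final String[] args) {
            final Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-app");            // placeholder
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

            final StreamsBuilder builder = new StreamsBuilder();
            builder.stream("input-topic").to("output-topic"); // trivial pass-through; topic names are placeholders
            final Topology topology = builder.build();

            // deprecated since 2.0: new KafkaStreams(topology, new StreamsConfig(props))
            final KafkaStreams streams = new KafkaStreams(topology, props);
            streams.start();
        }
    }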

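Editor's note: for the removed Processor#punctuate and ProcessorContext#schedule(long) mentioned in the last hunk, the replacement API registers a Punctuator explicitly. A sketch, with an arbitrary interval and output:

    import org.apache.kafka.streams.processor.AbstractProcessor;
    import org.apache.kafka.streams.processor.ProcessorContext;
    import org.apache.kafka.streams.processor.PunctuationType;

    // Sketch: schedule a periodic callback instead of overriding the removed punctuate(long).
    public class PeriodicProcessor extends AbstractProcessor<String, String> {

        @Override
        public void init(final ProcessorContext context) {
            super.init(context);
            // fire every 60 seconds of stream time; PunctuationType.WALL_CLOCK_TIME is the alternative
            context.schedule(60_000L, PunctuationType.STREAM_TIME,
                    timestamp -> System.out.println("punctuate at " + timestamp));
        }

        @Override
        public void process(final String key, final String value) {
            context().forward(key, value);
        }
    }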