kafka-commits mailing list archives

From guozh...@apache.org
Subject kafka git commit: KAFKA-4112: Remove alpha quality label from Kafka Streams in docs
Date Thu, 01 Sep 2016 19:21:58 GMT
Repository: kafka
Updated Branches:
  refs/heads/trunk 3a161db57 -> 268cff704

KAFKA-4112: Remove alpha quality label from Kafka Streams in docs

Rephrase the 'alpha quality' wording in the Streams section of api.html.
Also a couple of other minor fixes in streams.html.

Author: Damian Guy <damian.guy@gmail.com>

Reviewers: Guozhang Wang, Ismael Juma, Michael G. Noll

Closes #1811 from dguy/kstreams-312

Project: http://git-wip-us.apache.org/repos/asf/kafka/repo
Commit: http://git-wip-us.apache.org/repos/asf/kafka/commit/268cff70
Tree: http://git-wip-us.apache.org/repos/asf/kafka/tree/268cff70
Diff: http://git-wip-us.apache.org/repos/asf/kafka/diff/268cff70

Branch: refs/heads/trunk
Commit: 268cff704994693e177c38cd2a16bf5179765f33
Parents: 3a161db
Author: Damian Guy <damian.guy@gmail.com>
Authored: Thu Sep 1 12:21:55 2016 -0700
Committer: Guozhang Wang <wangguoz@gmail.com>
Committed: Thu Sep 1 12:21:55 2016 -0700

 docs/api.html     | 15 ++++++++-------
 docs/streams.html |  4 ++--
 2 files changed, 10 insertions(+), 9 deletions(-)

diff --git a/docs/api.html b/docs/api.html
index c457241..2cb9d86 100644
--- a/docs/api.html
+++ b/docs/api.html
@@ -168,19 +168,20 @@ Examples showing how to use the consumer are given in the
 <h3><a id="streamsapi" href="#streamsapi">2.3 Streams API</a></h3>
-As of the 0.10.0 release we have added a new client library named <b>Kafka Streams</b> to let users implement their stream processing
-applications with data stored in Kafka topics. Kafka Streams is considered alpha quality and its public APIs are likely to change in
-future releases.
-You can use Kafka Streams by adding a dependency on the streams jar using
-the following example maven co-ordinates (you can change the version numbers with new releases):
+As of the 0.10.0 release we have added a stream processing engine to Apache Kafka called Kafka Streams, which is a client library that lets users implement their own stream processing applications for data stored in Kafka topics.
+You can use Kafka Streams from within your Java applications by adding a dependency on the kafka-streams jar using the following maven co-ordinates:
-	    &lt;version&gt;&lt;/version&gt;
+	    &lt;version&gt;&lt;/version&gt;
 Examples showing how to use this library are given in the
-<a href="http://kafka.apache.org/0100/javadoc/index.html?org/apache/kafka/streams/KafkaStreams.html"
title="Kafka 0.10.0 Javadoc">javadocs</a> (note those classes annotated with <b>@InterfaceStability.Unstable</b>,
indicating their public APIs may change without backward-compatibility in future releases).
\ No newline at end of file
+<a href="http://kafka.apache.org/0100/javadoc/index.html?org/apache/kafka/streams/KafkaStreams.html"
title="Kafka 0.10.0 Javadoc">javadocs</a> and <a href="streams.html" title="Kafka
Streams Overview">kafka streams overview</a>.
+    Please note that Kafka Streams is a new component of Kafka, and its public APIs may change
in future releases.
+    We use the <b>@InterfaceStability.Unstable</b> annotation to denote classes
whose APIs may change without backward-compatibility in future releases.
\ No newline at end of file
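For context, the maven co-ordinates that the new api.html wording refers to take roughly this shape. This is a sketch only: the version element is left for the reader to fill in with the release they target, mirroring the docs, which likewise do not pin a version.

```xml
<!-- Illustrative Maven dependency for Kafka Streams; substitute the
     release you target in the version element. -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-streams</artifactId>
    <version><!-- e.g. the release you target --></version>
</dependency>
```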

diff --git a/docs/streams.html b/docs/streams.html
index cd2cd93..9c21ec4 100644
--- a/docs/streams.html
+++ b/docs/streams.html
@@ -260,7 +260,7 @@ from a single topic).
     KStreamBuilder builder = new KStreamBuilder();
     KStream<String, GenericRecord> source1 = builder.stream("topic1", "topic2");
-    KTable<String, GenericRecord> source2 = builder.table("topic3");
+    KTable<String, GenericRecord> source2 = builder.table("topic3", "stateStoreName");
 <h5><a id="streams_dsl_transform" href="#streams_dsl_transform">Transform a stream</a></h5>
@@ -298,7 +298,7 @@ based on them.
     // written in Java 8+, using lambda expressions
-    KTable<Windowed<String>, Long> counts = source1.groupBykey().aggregate(
+    KTable<Windowed<String>, Long> counts = source1.groupByKey().aggregate(
         () -> 0L,  // initial value
         (aggKey, value, aggregate) -> aggregate + 1L,   // aggregating value
         TimeWindows.of("counts", 5000L).advanceBy(1000L), // intervals in milliseconds
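The two fixes in streams.html can be seen together in a short topology sketch. This is a hedged example, not part of the commit: it assumes a Kafka Streams version matching this docs change (trunk at the time) on the classpath, and the topic and state store names are illustrative.

```java
import org.apache.kafka.streams.kstream.KGroupedStream;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import org.apache.kafka.streams.kstream.KTable;

public class StreamsDocExample {
    public static void main(String[] args) {
        KStreamBuilder builder = new KStreamBuilder();

        // A KStream can be created from one or more topics...
        KStream<String, String> source1 = builder.stream("topic1", "topic2");

        // ...while builder.table() also takes a state store name,
        // matching the corrected line in streams.html.
        KTable<String, String> source2 = builder.table("topic3", "stateStoreName");

        // groupByKey() (capital K) is the correct method name; the
        // method-name typo groupBykey() is what this commit fixes.
        KGroupedStream<String, String> grouped = source1.groupByKey();
    }
}
```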
