kafka-commits mailing list archives

From guozh...@apache.org
Subject [3/3] kafka-site git commit: Port changes from PR4017 and PR3862 to 0110
Date Thu, 12 Oct 2017 19:09:16 GMT
Port changes from PR4017 and PR3862 to 0110

Port changes from https://github.com/apache/kafka/pull/4017 and https://github.com/apache/kafka/pull/3862
to 0110.

guozhangwang

Author: Joel Hamill <joel@Joel-Hamill-Confluent.local>
Author: Joel Hamill <git config --global user.email>

Reviewers: Guozhang Wang <wangguoz@gmail.com>

Closes #97 from joel-hamill/joel-hamill/0110-docs


Project: http://git-wip-us.apache.org/repos/asf/kafka-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/kafka-site/commit/97a78e3b
Tree: http://git-wip-us.apache.org/repos/asf/kafka-site/tree/97a78e3b
Diff: http://git-wip-us.apache.org/repos/asf/kafka-site/diff/97a78e3b

Branch: refs/heads/asf-site
Commit: 97a78e3be4d9aef2fd3ebed1189346b20ffb8202
Parents: 33a9de2
Author: Joel Hamill <joel@Joel-Hamill-Confluent.local>
Authored: Thu Oct 12 12:09:10 2017 -0700
Committer: Guozhang Wang <wangguoz@gmail.com>
Committed: Thu Oct 12 12:09:10 2017 -0700

----------------------------------------------------------------------
 0110/introduction.html            |    2 +-
 0110/streams/developer-guide.html | 2046 +++++++++++++++++++++++++++++---
 0110/streams/quickstart.html      |    6 +-
 0110/streams/tutorial.html        |    2 +-
 0110/streams/upgrade-guide.html   |    2 +-
 5 files changed, 1880 insertions(+), 178 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/kafka-site/blob/97a78e3b/0110/introduction.html
----------------------------------------------------------------------
diff --git a/0110/introduction.html b/0110/introduction.html
index e3809fc..5b3bb4a 100644
--- a/0110/introduction.html
+++ b/0110/introduction.html
@@ -202,7 +202,7 @@
  Likewise for streaming data pipelines the combination of subscription to real-time events make it possible to use Kafka for very low-latency pipelines; but the ability to store data reliably make it possible to use it for critical data where the delivery of data must be guaranteed or for integration with offline systems that load data only periodically or may go down for extended periods of time for maintenance. The stream processing facilities make it possible to transform data as it arrives.
   </p>
   <p>
-  For more information on the guarantees, apis, and capabilities Kafka provides see the rest of the <a href="/documentation.html">documentation</a>.
+  For more information on the guarantees, APIs, and capabilities Kafka provides see the rest of the <a href="/documentation.html">documentation</a>.
   </p>
 </script>
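
The hunk above is the introduction page's description of using Kafka for low-latency pipelines and stream processing. As a rough sketch of the "transform data as it arrives" idea using the 0.11.0-era Streams DSL (KStreamBuilder), the following Java snippet assumes hypothetical topic names ("input-topic", "output-topic"), an application id, and a broker address; it is an illustration only, not part of the ported docs.

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class TransformAsItArrives {
    public static void main(String[] args) {
        // Basic Streams configuration; the application id and broker address
        // below are placeholder assumptions.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "intro-transform-example");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Read each record from a (hypothetical) input topic, transform its
        // value as it arrives, and write the result to an output topic.
        KStreamBuilder builder = new KStreamBuilder();
        KStream<String, String> lines = builder.stream("input-topic");
        lines.mapValues(value -> value.toUpperCase()).to("output-topic");

        KafkaStreams streams = new KafkaStreams(builder, props);
        streams.start();
    }
}

The transformation is a stateless per-record mapValues, so no state store or windowing is needed for this sketch.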
 

