kafka-commits mailing list archives

From mj...@apache.org
Subject [kafka] branch 1.0 updated: MINOR: update release script and fix docs (#5347)
Date Mon, 16 Jul 2018 23:11:25 GMT
This is an automated email from the ASF dual-hosted git repository.

mjsax pushed a commit to branch 1.0
in repository https://gitbox.apache.org/repos/asf/kafka.git


The following commit(s) were added to refs/heads/1.0 by this push:
     new f25e20d  MINOR: update release script and fix docs (#5347)
f25e20d is described below

commit f25e20d5c2a0b73b091722a3eda7036b00ebfbb7
Author: Matthias J. Sax <mjsax@apache.org>
AuthorDate: Mon Jul 16 16:11:20 2018 -0700

    MINOR: update release script and fix docs (#5347)
    
    Reviewer: Guozhang Wang <guozhang@confluent.io>
---
 docs/streams/tutorial.html |   2 +-
 docs/upgrade.html          |   4 +-
 release.py                 | 186 +++++++++++++++++++++++++++++++++++++--------
 3 files changed, 157 insertions(+), 35 deletions(-)

diff --git a/docs/streams/tutorial.html b/docs/streams/tutorial.html
index 11162e8..0bad0e5 100644
--- a/docs/streams/tutorial.html
+++ b/docs/streams/tutorial.html
@@ -532,7 +532,7 @@
               .groupBy((key, value) -> value)
               .count(Materialized.&lt;String, Long, KeyValueStore&lt;Bytes, byte[]&gt;&gt;as("counts-store"))
               .toStream()
-              .to("streams-wordcount-output", Produced.with(Serdes.String(), Serdes.Long());
+              .to("streams-wordcount-output", Produced.with(Serdes.String(), Serdes.Long()));
     </pre>
 
     <p>
diff --git a/docs/upgrade.html b/docs/upgrade.html
index 33b94d4..2d53cd5 100644
--- a/docs/upgrade.html
+++ b/docs/upgrade.html
@@ -129,7 +129,7 @@
          be used if the SaslHandshake request version is greater than 0. </li>
 </ul>
 
-<h5><a id="upgrade_100_streams" href="#upgrade_100_streams">Upgrading a 0.11.0 Kafka Streams Application</a></h5>
+<h5><a id="upgrade_100_streams_from_0110" href="#upgrade_100_streams_from_0110">Upgrading a 0.11.0 Kafka Streams Application</a></h5>
 <ul>
     <li> Upgrading your Streams application from 0.11.0 to 1.0.0 does not require a broker upgrade.
          A Kafka Streams 1.0.0 application can connect to 0.11.0, 0.10.2 and 0.10.1 brokers (it is not possible to connect to 0.10.0 brokers though).
@@ -554,6 +554,7 @@ only support 0.10.1.x or later brokers while 0.10.1.x brokers also support older
     <li> Upgrading your Streams application from 0.10.0 to 0.10.1 does require a <a href="#upgrade_10_1">broker upgrade</a> because a Kafka Streams 0.10.1 application can only connect to 0.10.1 brokers. </li>
     <li> There are couple of API changes, that are not backward compatible (cf. <a href="/{{version}}/documentation/streams/upgrade-guide#streams_api_changes_0101">Streams API changes in 0.10.1</a> for more details).
          Thus, you need to update and recompile your code. Just swapping the Kafka Streams library jar file will not work and will break your application. </li>
+<!-- TODO: add when 0.10.1.2 is released
+    <li> Upgrading from 0.10.0.x to 0.10.1.2 requires two rolling bounces with config <code>upgrade.from="0.10.0"</code> set for first upgrade phase
         (cf. <a href="https://cwiki.apache.org/confluence/display/KAFKA/KIP-268%3A+Simplify+Kafka+Streams+Rebalance+Metadata+Upgrade">KIP-268</a>).
         As an alternative, an offline upgrade is also possible.
@@ -564,6 +565,7 @@ only support 0.10.1.x or later brokers while 0.10.1.x brokers also support older
            <li> bounce each instance of your application once more to complete the upgrade </li>
         </ul>
     </li>
+-->
     <li> Upgrading from 0.10.0.x to 0.10.1.0 or 0.10.1.1 requires an offline upgrade (rolling bounce upgrade is not supported)
         <ul>
             <li> stop all old (0.10.0.x) application instances </li>
diff --git a/release.py b/release.py
index a21ffdc..0487f66 100755
--- a/release.py
+++ b/release.py
@@ -20,14 +20,31 @@
 """
 Utility for creating release candidates and promoting release candidates to a final release.
 
-Usage: release.py
+Usage: release.py [subcommand]
 
-The utility is interactive; you will be prompted for basic release information and guided through the process.
+release.py stage
+
+  Builds and stages an RC for a release.
+
+  The utility is interactive; you will be prompted for basic release information and guided through the process.
+
+  This utility assumes you already have a local kafka git folder and that you
+  have added remotes corresponding to both:
+  (i) the github apache kafka mirror and
+  (ii) the apache kafka git repo.
+
+release.py stage-docs [kafka-site-path]
+
+  Builds the documentation and stages it into an instance of the Kafka website repository.
+
+  This is meant to automate the integration between the main Kafka website repository (https://github.com/apache/kafka-site)
+  and the versioned documentation maintained in the main Kafka repository. This is useful both for local testing and
+  development of docs (follow the instructions here: https://cwiki.apache.org/confluence/display/KAFKA/Setup+Kafka+Website+on+Local+Apache+Server)
+  as well as for committers to deploy docs (run this script, then validate, commit, and push to kafka-site).
+
+  With no arguments this script assumes you have the Kafka repository and kafka-site repository checked out side-by-side, but
+  you can specify a full path to the kafka-site repository if this is not the case.
 
-This utility assumes you already have local a kafka git folder and that you
-have added remotes corresponding to both:
-(i) the github apache kafka mirror and
-(ii) the apache kafka git repo.
 """
 
 from __future__ import print_function
@@ -45,8 +62,8 @@ CAPITALIZED_PROJECT_NAME = "kafka".upper()
 SCRIPT_DIR = os.path.abspath(os.path.dirname(__file__))
 # Location of the local git repository
 REPO_HOME = os.environ.get("%s_HOME" % CAPITALIZED_PROJECT_NAME, SCRIPT_DIR)
-# Remote name which points to Apache git
-PUSH_REMOTE_NAME = os.environ.get("PUSH_REMOTE_NAME", "apache")
+# Remote name, which points to Github by default
+PUSH_REMOTE_NAME = os.environ.get("PUSH_REMOTE_NAME", "apache-github")
 PREFS_FILE = os.path.join(SCRIPT_DIR, '.release-settings.json')
 
 delete_gitrefs = False
@@ -77,6 +94,7 @@ def print_output(output):
 def cmd(action, cmd, *args, **kwargs):
     if isinstance(cmd, basestring) and not kwargs.get("shell", False):
         cmd = cmd.split()
+    allow_failure = kwargs.pop("allow_failure", False)
 
     stdin_log = ""
     if "stdin" in kwargs and isinstance(kwargs["stdin"], basestring):
@@ -93,6 +111,9 @@ def cmd(action, cmd, *args, **kwargs):
     except subprocess.CalledProcessError as e:
         print_output(e.output)
 
+        if allow_failure:
+            return
+
         print("*************************************************")
         print("*** First command failure occurred here.      ***")
         print("*** Will now try to clean up working state.   ***")
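The new `allow_failure` flag is popped out of `**kwargs` before the arguments are handed on to `subprocess`, so the extra key never reaches the library call. A minimal, self-contained sketch of the pattern (the `run` helper and its messages are illustrative, not the script's actual `cmd`):

```python
import subprocess

def run(action, args, **kwargs):
    """Run a command; with allow_failure=True, swallow a non-zero exit
    instead of raising (mirrors the release.py cmd() change)."""
    # Pop the flag first so subprocess never sees an unknown kwarg.
    allow_failure = kwargs.pop("allow_failure", False)
    print("==> %s" % action)
    try:
        return subprocess.check_output(args, **kwargs)
    except subprocess.CalledProcessError:
        if allow_failure:
            return None  # caller treats the failure as benign
        raise

# A failing command is tolerated only when explicitly requested:
run("Probing (may fail)", ["false"], allow_failure=True)
```

Popping before the `subprocess` call matters: forwarding `allow_failure=...` to `check_output` would raise a `TypeError` for an unexpected keyword argument.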
@@ -126,9 +147,9 @@ def sftp_mkdir(dir):
     try:
        cmd_str  = """
 cd %s
-mkdir %s
+-mkdir %s
 """ % (basedir, dirname)
-       cmd("Creating '%s' in '%s' in your Apache home directory if it does not exist (errors are ok if the directory already exists)" % (dirname, basedir), "sftp -b - %s@home.apache.org" % apache_id, stdin=cmd_str)
+       cmd("Creating '%s' in '%s' in your Apache home directory if it does not exist (errors are ok if the directory already exists)" % (dirname, basedir), "sftp -b - %s@home.apache.org" % apache_id, stdin=cmd_str, allow_failure=True)
     except subprocess.CalledProcessError:
         # This is ok. The command fails if the directory already exists
         pass
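The one-character fix above relies on documented sftp batch-mode behavior: in a `-b` batch file, prefixing a command with `-` makes sftp ignore that command's failure (here, `mkdir` on a directory that already exists) instead of aborting the whole batch. A small sketch of how such a batch string can be assembled (the function name is hypothetical):

```python
import posixpath

def sftp_mkdir_batch(path):
    """Build an sftp -b batch that creates `path`, tolerating its prior
    existence: the leading '-' tells sftp to ignore that command's
    failure rather than abort the batch."""
    basedir, dirname = posixpath.split(path)
    return "cd %s\n-mkdir %s\n" % (basedir, dirname)

batch = sftp_mkdir_batch("public_html/kafka-1.0.0-rc1")
```

The batch would then be fed to `sftp -b - user@host` on stdin, as the diff does via `stdin=cmd_str`.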
@@ -141,11 +162,113 @@ def get_pref(prefs, name, request_fn):
         prefs[name] = val
     return val
 
-# Load saved preferences
-prefs = {}
-if os.path.exists(PREFS_FILE):
-    with open(PREFS_FILE, 'r') as prefs_fp:
-        prefs = json.load(prefs_fp)
+def load_prefs():
+    """Load saved preferences"""
+    prefs = {}
+    if os.path.exists(PREFS_FILE):
+        with open(PREFS_FILE, 'r') as prefs_fp:
+            prefs = json.load(prefs_fp)
+    return prefs
+
+def save_prefs(prefs):
+    """Save preferences"""
+    print("Saving preferences to %s" % PREFS_FILE)
+    with open(PREFS_FILE, 'w') as prefs_fp:
+        prefs = json.dump(prefs, prefs_fp)
+
+def get_jdk(prefs, version):
+    """
+    Get settings for the specified JDK version.
+    """
+    jdk_java_home = get_pref(prefs, 'jdk%d' % version, lambda: raw_input("Enter the path for JAVA_HOME for a JDK%d compiler (blank to use default JAVA_HOME): " % version))
+    jdk_env = dict(os.environ) if jdk_java_home.strip() else None
+    if jdk_env is not None: jdk_env['JAVA_HOME'] = jdk_java_home
+    if "1.%d.0" % version not in cmd_output("java -version", env=jdk_env):
+        fail("JDK %s is required" % version)
+    return jdk_env
+
+def get_version(repo=REPO_HOME):
+    """
+    Extracts the full version information as a str from gradle.properties
+    """
+    with open(os.path.join(repo, 'gradle.properties')) as fp:
+        for line in fp:
+            parts = line.split('=')
+            if parts[0].strip() != 'version': continue
+            return parts[1].strip()
+    fail("Couldn't extract version from gradle.properties")
+
+def docs_version(version):
+    """
+    Detects the major/minor version and converts it to the format used for docs on the website, e.g. gets 0.10.2.0-SNAPSHOT
+    from gradle.properties and converts it to 0102
+    """
+    version_parts = version.strip().split('.')
+    # 1.0+ will only have 3 version components as opposed to pre-1.0 that had 4
+    major_minor = version_parts[0:3] if version_parts[0] == '0' else version_parts[0:2]
+    return ''.join(major_minor)
+
+def docs_release_version(version):
+    """
+    Detects the version from gradle.properties and converts it to a release version number that should be valid for the
+    current release branch. For example, 0.10.2.0-SNAPSHOT would remain 0.10.2.0-SNAPSHOT (because no release has been
+    made on that branch yet); 0.10.2.1-SNAPSHOT would be converted to 0.10.2.0 because 0.10.2.1 is still in development
+    but 0.10.2.0 should have already been released. Regular version numbers (e.g. as encountered on a release branch)
+    will remain the same.
+    """
+    version_parts = version.strip().split('.')
+    if '-SNAPSHOT' in version_parts[-1]:
+        bugfix = int(version_parts[-1].split('-')[0])
+        if bugfix > 0:
+            version_parts[-1] = str(bugfix - 1)
+    return '.'.join(version_parts)
+
+def command_stage_docs():
+    kafka_site_repo_path = sys.argv[2] if len(sys.argv) > 2 else os.path.join(REPO_HOME, '..', 'kafka-site')
+    if not os.path.exists(kafka_site_repo_path) or not os.path.exists(os.path.join(kafka_site_repo_path, 'powered-by.html')):
+        sys.exit("%s doesn't exist or does not appear to be the kafka-site repository" % kafka_site_repo_path)
+
+    prefs = load_prefs()
+    jdk8_env = get_jdk(prefs, 8)
+    save_prefs(prefs)
+
+    version = get_version()
+    # We explicitly override the version of the project that we normally get from gradle.properties since we want to be
+    # able to run this from a release branch where we made some updates, but the build would show an incorrect SNAPSHOT
+    # version due to already having bumped the bugfix version number.
+    gradle_version_override = docs_release_version(version)
+
+    cmd("Building docs", "./gradlew -Pversion=%s clean releaseTarGzAll aggregatedJavadoc" % gradle_version_override, cwd=REPO_HOME, env=jdk8_env)
+
+    docs_tar = os.path.join(REPO_HOME, 'core', 'build', 'distributions', 'kafka_2.11-%s-site-docs.tgz' % gradle_version_override)
+
+    versioned_docs_path = os.path.join(kafka_site_repo_path, docs_version(version))
+    if not os.path.exists(versioned_docs_path):
+        os.mkdir(versioned_docs_path, 0755)
+
+    # The contents of the docs jar are site-docs/<docs dir>. We need to get rid of the site-docs prefix and dump everything
+    # inside it into the docs version subdirectory in the kafka-site repo
+    cmd('Extracting site-docs', 'tar xf %s --strip-components 1' % docs_tar, cwd=versioned_docs_path)
+
+    javadocs_src_dir = os.path.join(REPO_HOME, 'build', 'docs', 'javadoc')
+
+    cmd('Copying javadocs', 'cp -R %s %s' % (javadocs_src_dir, versioned_docs_path))
+
+    sys.exit(0)
+
+
+# Dispatch to subcommand
+subcommand = sys.argv[1] if len(sys.argv) > 1 else None
+if subcommand == 'stage-docs':
+    command_stage_docs()
+elif not (subcommand is None or subcommand == 'stage'):
+    fail("Unknown subcommand: %s" % subcommand)
+# else -> default subcommand stage
+
+
+## Default 'stage' subcommand implementation isn't isolated to its own function yet for historical reasons
+
+prefs = load_prefs()
 
 if not user_ok("""Requirements:
 1. Updated docs to reference the new release version where appropriate.
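The two version helpers added in the hunk above are small enough to restate as a standalone, Python 3-runnable sketch (the script itself targets Python 2, e.g. `raw_input`), which makes their mappings easy to check:

```python
def docs_version(version):
    """'0.10.2.0-SNAPSHOT' -> '0102'; '1.0.1' -> '10'. Pre-1.0 versions
    keep three components, 1.0+ keeps two, then dots are dropped."""
    parts = version.strip().split('.')
    major_minor = parts[0:3] if parts[0] == '0' else parts[0:2]
    return ''.join(major_minor)

def docs_release_version(version):
    """Map an in-development version to the latest release on the same
    branch: 0.10.2.1-SNAPSHOT -> 0.10.2.0, while 0.10.2.0-SNAPSHOT
    (nothing released yet) and plain release versions pass through."""
    parts = version.strip().split('.')
    if '-SNAPSHOT' in parts[-1]:
        bugfix = int(parts[-1].split('-')[0])
        if bugfix > 0:
            parts[-1] = str(bugfix - 1)
    return '.'.join(parts)
```

This is why `stage-docs` can run from a release branch whose `gradle.properties` has already been bumped to the next bugfix SNAPSHOT.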
@@ -217,7 +340,7 @@ except ValueError:
 rc = raw_input("Release candidate number: ")
 
 dev_branch = '.'.join(release_version_parts[:2])
-docs_version = ''.join(release_version_parts[:2])
+docs_release_version = docs_version(release_version)
 
 # Validate that the release doesn't already exist and that the
 cmd("Fetching tags from upstream", 'git fetch --tags %s' % PUSH_REMOTE_NAME)
@@ -240,7 +363,6 @@ if not rc:
 # Prereq checks
 apache_id = get_pref(prefs, 'apache_id', lambda: raw_input("Enter your apache username: "))
 
-
 jdk7_java_home = get_pref(prefs, 'jdk7', lambda: raw_input("Enter the path for JAVA_HOME for a JDK7 compiler (blank to use default JAVA_HOME): "))
 jdk7_env = dict(os.environ) if jdk7_java_home.strip() else None
 if jdk7_env is not None: jdk7_env['JAVA_HOME'] = jdk7_java_home
@@ -254,6 +376,7 @@ if "1.8.0" not in cmd_output("java -version", env=jdk8_env):
     fail("You must be able to build artifacts with JDK8 for Scala 2.12 artifacts")
 
 
+
 def select_gpg_key():
     print("Here are the available GPG keys:")
     available_keys = cmd_output("gpg --list-secret-keys")
@@ -271,10 +394,7 @@ with tempfile.NamedTemporaryFile() as gpg_test_tempfile:
     gpg_test_tempfile.write("abcdefg")
     cmd("Testing GPG key & passphrase", ["gpg", "--batch", "--pinentry-mode", "loopback", "--passphrase-fd", "0", "-u", key_name, "--armor", "--output", gpg_test_tempfile.name + ".asc", "--detach-sig", gpg_test_tempfile.name], stdin=gpg_passphrase)
 
-# Save preferences
-print("Saving preferences to %s" % PREFS_FILE)
-with open(PREFS_FILE, 'w') as prefs_fp:
-    prefs = json.dump(prefs, prefs_fp)
+save_prefs(prefs)
 
 # Generate RC
 try:
@@ -340,7 +460,7 @@ cmd("Creating source archive", "git archive --format tar.gz --prefix kafka-%(rel
 
 cmd("Building artifacts", "gradle", cwd=kafka_dir, env=jdk7_env)
 cmd("Building artifacts", "./gradlew clean releaseTarGzAll aggregatedJavadoc", cwd=kafka_dir, env=jdk7_env)
-# we need extra cmd to build 2.12 with jdk8 specifically
+# This should be removed when Java7 is dropped (cf. KAFKA-4421)
 cmd("Building artifacts for Scala 2.12", "./gradlew releaseTarGz -PscalaVersion=2.12", cwd=kafka_dir, env=jdk8_env)
 cmd("Copying artifacts", "cp %s/core/build/distributions/* %s" % (kafka_dir, artifacts_dir), shell=True)
 cmd("Copying artifacts", "cp -R %s/build/docs/javadoc %s" % (kafka_dir, artifacts_dir))
@@ -372,16 +492,16 @@ sftp_cmds = ""
 for root, dirs, files in os.walk(artifacts_dir):
     assert root.startswith(artifacts_dir)
 
-    for file in files:
-        local_path = os.path.join(root, file)
-        remote_path = os.path.join("public_html", kafka_output_dir, root[len(artifacts_dir)+1:], file)
-        sftp_cmds += "\nput %s %s" % (local_path, remote_path)
-
     for dir in dirs:
         sftp_mkdir(os.path.join("public_html", kafka_output_dir, root[len(artifacts_dir)+1:], dir))
 
-if sftp_cmds:
-    cmd("Uploading artifacts in %s to your Apache home directory" % root, "sftp -b - %s@home.apache.org" % apache_id, stdin=sftp_cmds)
+    for file in files:
+        local_path = os.path.join(root, file)
+        remote_path = os.path.join("public_html", kafka_output_dir, root[len(artifacts_dir)+1:], file)
+        sftp_cmds = """
+put %s %s
+""" % (local_path, remote_path)
+        cmd("Uploading artifacts in %s to your Apache home directory" % root, "sftp -b - %s@home.apache.org" % apache_id, stdin=sftp_cmds)
 
 with open(os.path.expanduser("~/.gradle/gradle.properties")) as f:
     contents = f.read()
@@ -389,14 +509,14 @@ if not user_ok("Going to build and upload mvn artifacts based on these settings:
     fail("Retry again later")
 cmd("Building and uploading archives", "./gradlew uploadArchivesAll", cwd=kafka_dir, env=jdk7_env)
 cmd("Building and uploading archives", "./gradlew uploadCoreArchives_2_12 -PscalaVersion=2.12", cwd=kafka_dir, env=jdk8_env)
-cmd("Building and uploading archives", "mvn deploy", cwd=streams_quickstart_dir, env=jdk7_env)
+cmd("Building and uploading archives", "mvn deploy -Pgpg-signing", cwd=streams_quickstart_dir, env=jdk7_env)
 
 release_notification_props = { 'release_version': release_version,
                                'rc': rc,
                                'rc_tag': rc_tag,
                                'rc_githash': rc_githash,
                                'dev_branch': dev_branch,
-                               'docs_version': docs_version,
+                               'docs_version': docs_release_version,
                                'apache_id': apache_id,
                                }
 
@@ -474,7 +594,7 @@ https://repository.apache.org/content/groups/staging/
 http://home.apache.org/~%(apache_id)s/kafka-%(rc_tag)s/javadoc/
 
 * Tag to be voted upon (off %(dev_branch)s branch) is the %(release_version)s tag:
-https://git-wip-us.apache.org/repos/asf?p=kafka.git;a=tag;h=%(rc_githash)s
+https://github.com/apache/kafka/releases/tag/%(rc_tag)s
 
 * Documentation:
 http://kafka.apache.org/%(docs_version)s/documentation.html
@@ -484,7 +604,7 @@ http://kafka.apache.org/%(docs_version)s/protocol.html
 
 * Successful Jenkins builds for the %(dev_branch)s branch:
 Unit/integration tests: https://builds.apache.org/job/kafka-%(dev_branch)s-jdk7/<BUILD NUMBER>/
-System tests: https://jenkins.confluent.io/job/system-test-kafka-%(dev_branch)s/<BUILD_NUMBER>/
+System tests: https://jenkins.confluent.io/job/system-test-kafka/job/%(dev_branch)s/<BUILD_NUMBER>/
 
 /**************************************
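For reference, the subcommand dispatch this commit introduces at the top of the script boils down to the following pattern (a sketch with stub command functions; the real script runs the default staging flow inline rather than via a `command_stage` function):

```python
def command_stage_docs():
    # Stub standing in for the real docs-staging flow.
    return "stage-docs"

def command_stage():
    # Stub standing in for the default RC-staging flow.
    return "stage"

def dispatch(argv):
    """Mirror release.py's dispatch: no subcommand (or 'stage') runs the
    default staging flow; 'stage-docs' stages docs; anything else fails."""
    subcommand = argv[1] if len(argv) > 1 else None
    if subcommand == 'stage-docs':
        return command_stage_docs()
    if subcommand is None or subcommand == 'stage':
        return command_stage()
    raise SystemExit("Unknown subcommand: %s" % subcommand)
```

Keeping `stage` as the implicit default preserves the old `release.py` invocation with no arguments.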
 

