kafka-commits mailing list archives

From j...@apache.org
Subject [kafka] branch 2.1 updated: MINOR: Adjust test params pursuant to KAFKA-4514. (#5777)
Date Wed, 10 Oct 2018 22:18:32 GMT
This is an automated email from the ASF dual-hosted git repository.

jgus pushed a commit to branch 2.1
in repository https://gitbox.apache.org/repos/asf/kafka.git


The following commit(s) were added to refs/heads/2.1 by this push:
     new 358b41c  MINOR: Adjust test params pursuant to KAFKA-4514. (#5777)
358b41c is described below

commit 358b41c9bf90a05cad18b1497fa5280473b7993a
Author: Colin Hicks <colin.hicks@gmail.com>
AuthorDate: Wed Oct 10 18:16:54 2018 -0400

    MINOR: Adjust test params pursuant to KAFKA-4514. (#5777)
    
    PR #2267 introduced support for Zstandard compression. The relevant test expects values
    for `num_nodes` and `num_producers` based on the (now-incremented) count of compression types.
    
    Passed the affected, previously-failing test:
    `ducker-ak test tests/kafkatest/tests/client/compression_test.py`
    
    Reviewers: Jason Gustafson <jason@confluent.io>
---
 tests/kafkatest/tests/client/compression_test.py | 11 ++++++-----
 1 file changed, 6 insertions(+), 5 deletions(-)

diff --git a/tests/kafkatest/tests/client/compression_test.py b/tests/kafkatest/tests/client/compression_test.py
index 2085d9b..23b30ea 100644
--- a/tests/kafkatest/tests/client/compression_test.py
+++ b/tests/kafkatest/tests/client/compression_test.py
@@ -29,6 +29,7 @@ class CompressionTest(ProduceConsumeValidateTest):
     """
     These tests validate produce / consume for compressed topics.
     """
+    COMPRESSION_TYPES = ["snappy", "gzip", "lz4", "zstd", "none"]
 
     def __init__(self, test_context):
         """:type test_context: ducktape.tests.test.TestContext"""
@@ -42,7 +43,7 @@ class CompressionTest(ProduceConsumeValidateTest):
         self.num_partitions = 10
         self.timeout_sec = 60
         self.producer_throughput = 1000
-        self.num_producers = 4
+        self.num_producers = len(self.COMPRESSION_TYPES)
         self.messages_per_producer = 1000
         self.num_consumers = 1
 
@@ -53,15 +54,15 @@ class CompressionTest(ProduceConsumeValidateTest):
         # Override this since we're adding services outside of the constructor
         return super(CompressionTest, self).min_cluster_size() + self.num_producers + self.num_consumers
 
-    @cluster(num_nodes=7)
-    @parametrize(compression_types=["snappy","gzip","lz4","zstd","none"])
+    @cluster(num_nodes=8)
+    @parametrize(compression_types=COMPRESSION_TYPES)
     def test_compressed_topic(self, compression_types):
         """Test produce => consume => validate for compressed topics
         Setup: 1 zk, 1 kafka node, 1 topic with partitions=10, replication-factor=1
 
        compression_types parameter gives a list of compression types (or no compression if
-        "none"). Each producer in a VerifiableProducer group (num_producers = 4) will use a
-        compression type from the list based on producer's index in the group.
+        "none"). Each producer in a VerifiableProducer group (num_producers = number of compression
+        types) will use a compression type from the list based on producer's index in the group.
 
             - Produce messages in the background
             - Consume messages in the background
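The docstring above describes producers picking a compression type by their index in the group. A minimal standalone sketch of that index-based assignment (the `compression_for` helper is hypothetical, not part of the Kafka test suite; the list mirrors the `COMPRESSION_TYPES` constant added in this patch):

```python
# Illustrative sketch only: map each producer's index in the group to a
# compression type, as described in the test docstring above.
COMPRESSION_TYPES = ["snappy", "gzip", "lz4", "zstd", "none"]

def compression_for(producer_index, compression_types=COMPRESSION_TYPES):
    """Return the compression type for a producer based on its group index."""
    return compression_types[producer_index % len(compression_types)]

# With num_producers tied to the list length, every type is exercised once.
num_producers = len(COMPRESSION_TYPES)
assignments = [compression_for(i) for i in range(num_producers)]
print(assignments)  # ['snappy', 'gzip', 'lz4', 'zstd', 'none']
```

Deriving `num_producers` from the list length (rather than hard-coding 4) is what keeps the test in step when a new compression type such as zstd is added.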

