See <https://builds.apache.org/job/Kafka-trunk/141/changes>
Changes:
[junrao] Require values in Utils.getTopic* methods to be positive; patched by Swapnil Ghike; reviewed by Jun Rao; KAFKA-481
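The KAFKA-481 entry above describes adding validation that the per-topic values parsed by the Utils.getTopic* helpers are positive. A minimal, purely illustrative Scala sketch of that kind of guard follows; the object and method names here are hypothetical and not the actual patch:

    // Hypothetical sketch only -- names are illustrative, not the KAFKA-481 change itself.
    object TopicValueParsing {
      // Parses a "topicA:100,topicB:200" style property, requiring every value to be positive.
      def parsePositiveTopicValues(prop: String): Map[String, Int] =
        prop.split(",").filter(_.nonEmpty).map { entry =>
          val Array(topic, value) = entry.split(":")
          val v = value.trim.toInt
          require(v > 0, "Value for topic " + topic + " must be positive, found " + v)
          topic.trim -> v
        }.toMap

      def main(args: Array[String]): Unit = {
        println(parsePositiveTopicValues("topicA:100,topicB:200")) // Map(topicA -> 100, topicB -> 200)
        // parsePositiveTopicValues("topicA:-5") would throw IllegalArgumentException
      }
    }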
------------------------------------------
[...truncated 2288 lines...]
[2012-08-25 06:07:52,155] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-08-25 06:07:52,155] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:52,156] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:52,156] INFO Closing zookeeper client... (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:52,156] INFO zkActor stopped (kafka.log.LogManager:61)
[2012-08-25 06:07:52,158] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testPartitionedSendToNewBrokerInExistingTopic
[info] Test Starting: testDefaultPartitioner
[2012-08-25 06:07:53,203] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,204] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-08-25 06:07:53,204] INFO connecting to ZK: 127.0.0.1:2182 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,221] INFO Awaiting connections on port 50701 (kafka.network.Acceptor:130)
[2012-08-25 06:07:53,222] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-08-25 06:07:53,222] INFO Registering broker /brokers/ids/0 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,227] INFO Registering broker /brokers/ids/0 succeeded with id:0,creatorId:67.195.138.60-1345874873222,host:67.195.138.60,port:50701 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,227] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-25 06:07:53,227] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,228] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,228] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-08-25 06:07:53,229] INFO connecting to ZK: 127.0.0.1:2182 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,232] INFO Awaiting connections on port 43800 (kafka.network.Acceptor:130)
[2012-08-25 06:07:53,232] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-08-25 06:07:53,233] INFO Registering broker /brokers/ids/1 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,234] INFO Registering broker /brokers/ids/1 succeeded with id:1,creatorId:67.195.138.60-1345874873233,host:67.195.138.60,port:43800 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,234] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-25 06:07:53,235] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,235] INFO Connected to localhost:50701 for producing (kafka.producer.SyncProducer:61)
[2012-08-25 06:07:53,236] INFO Connected to localhost:43800 for producing (kafka.producer.SyncProducer:61)
[2012-08-25 06:07:53,236] INFO Created log for 'test-topic'-0 (kafka.log.LogManager:61)
[2012-08-25 06:07:53,236] INFO Begin registering broker topic /brokers/topics/test-topic/0 with 4 partitions (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,236] INFO Created log for 'test-topic'-2 (kafka.log.LogManager:61)
[2012-08-25 06:07:53,237] INFO Begin registering broker topic /brokers/topics/test-topic/1 with 4 partitions (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,246] INFO End registering broker topic /brokers/topics/test-topic/0 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,246] INFO End registering broker topic /brokers/topics/test-topic/1 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,737] INFO Closing all async producers (kafka.producer.ProducerPool:61)
[2012-08-25 06:07:53,738] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,739] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:53,740] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:53,740] INFO Closing zookeeper client... (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,740] INFO zkActor stopped (kafka.log.LogManager:61)
[2012-08-25 06:07:53,750] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,751] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,751] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:53,752] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:53,752] INFO Closing zookeeper client... (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,752] INFO zkActor stopped (kafka.log.LogManager:61)
[2012-08-25 06:07:53,753] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testDefaultPartitioner
[info] == core-kafka / kafka.producer.ProducerTest ==
[info]
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs
[2012-08-25 06:07:54,767] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-08-25 06:07:54,767] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-08-25 06:07:54,767] INFO connecting to ZK: 127.0.0.1:2182 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:54,786] INFO Awaiting connections on port 60023 (kafka.network.Acceptor:130)
[2012-08-25 06:07:54,786] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-08-25 06:07:54,786] INFO Registering broker /brokers/ids/0 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:54,794] INFO Registering broker /brokers/ids/0 succeeded with id:0,creatorId:67.195.138.60-1345874874787,host:67.195.138.60,port:60023 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:54,794] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-25 06:07:54,795] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-08-25 06:07:54,795] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-08-25 06:07:54,796] INFO starting log cleaner every 300000 ms (kafka.log.LogManager:61)
[2012-08-25 06:07:54,798] INFO Awaiting connections on port 50143 (kafka.network.Acceptor:130)
[2012-08-25 06:07:54,798] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-08-25 06:07:54,798] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-25 06:07:54,799] INFO Kafka server started. (kafka.server.KafkaServer:61)
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
log4j:WARN No appenders could be found for logger (kafka.producer.ProducerPool).
log4j:WARN Please initialize the log4j system properly.
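The "No appenders could be found" and "Please initialize the log4j system properly" lines above are the stock log4j 1.x warnings emitted when a logger is used before any appender is configured. A minimal sketch, assuming log4j 1.x on the classpath, of how a test harness could initialize logging up front; this is generic log4j usage, not code from this build:

    import org.apache.log4j.{BasicConfigurator, Level, Logger}

    object Log4jBootstrap {
      def main(args: Array[String]): Unit = {
        BasicConfigurator.configure()              // attach a ConsoleAppender to the root logger
        Logger.getRootLogger.setLevel(Level.INFO)  // keep test output readable
        Logger.getLogger("kafka.producer.ProducerPool").info("log4j initialized")
      }
    }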
[info] Test Passed: testKafkaLog4jConfigs
[info] Test Starting: testBrokerListLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testBrokerListLog4jAppends
[info] Test Starting: testZkConnectLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testZkConnectLog4jAppends
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info]
[info] == core-kafka / kafka.producer.AsyncProducerTest ==
[info] Test Starting: testProducerQueueSize
Queue is full..
[info] Test Passed: testProducerQueueSize
[info] Test Starting: testAddAfterQueueClosed
[info] Test Passed: testAddAfterQueueClosed
[info] Test Starting: testBatchSize
[info] Test Passed: testBatchSize
[info] Test Starting: testQueueTimeExpired
[info] Test Passed: testQueueTimeExpired
[info] Test Starting: testSenderThreadShutdown
[info] Test Passed: testSenderThreadShutdown
[info] Test Starting: testCollateEvents
[info] Test Passed: testCollateEvents
[info] Test Starting: testCollateAndSerializeEvents
[info] Test Passed: testCollateAndSerializeEvents
[info] == core-kafka / kafka.producer.AsyncProducerTest ==
[info]
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info] Test Starting: testEarliestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testEarliestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testEarliestOffsetResetBackward(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testEarliestOffsetResetBackward(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testLatestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[error] Test Failed: testLatestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
junit.framework.AssertionFailedError: expected:<0> but was:<3>
at junit.framework.Assert.fail(Assert.java:47)
at junit.framework.Assert.failNotEquals(Assert.java:277)
at junit.framework.Assert.assertEquals(Assert.java:64)
at junit.framework.Assert.assertEquals(Assert.java:195)
at junit.framework.Assert.assertEquals(Assert.java:201)
at kafka.integration.AutoOffsetResetTest.testLatestOffsetResetForward(AutoOffsetResetTest.scala:218)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at junit.framework.TestCase.runTest(TestCase.java:164)
at junit.framework.TestCase.runBare(TestCase.java:130)
at junit.framework.TestResult$1.protect(TestResult.java:110)
at junit.framework.TestResult.runProtected(TestResult.java:128)
at junit.framework.TestResult.run(TestResult.java:113)
at junit.framework.TestCase.run(TestCase.java:120)
at junit.framework.TestSuite.runTest(TestSuite.java:228)
at junit.framework.TestSuite.run(TestSuite.java:223)
at junit.framework.TestSuite.runTest(TestSuite.java:228)
at junit.framework.TestSuite.run(TestSuite.java:223)
at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
at sbt.TestRunner.run(TestFramework.scala:53)
at sbt.TestRunner.runTest$1(TestFramework.scala:67)
at sbt.TestRunner.run(TestFramework.scala:76)
at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
at sbt.NamedTestTask.run(TestFramework.scala:92)
at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
at sbt.impl.RunTask.runTask(RunTask.scala:85)
at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
at sbt.Control$.trapUnit(Control.scala:19)
at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
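For readers unfamiliar with the JUnit 3 failure format, the "expected:<0> but was:<3>" message above is produced by an assertEquals whose expected and actual arguments differ; the test at AutoOffsetResetTest.scala:218 apparently expected zero messages after a latest-offset reset but observed three. A hypothetical standalone reproduction of just the assertion shape (not the actual test code) looks like this:

    import junit.framework.Assert.assertEquals

    object AssertionShapeDemo {
      def main(args: Array[String]): Unit = {
        val expectedMessageCount = 0 // illustrative: a "latest" reset should skip earlier messages
        val observedMessageCount = 3 // illustrative: what the consumer actually returned in this run
        // Throws junit.framework.AssertionFailedError: expected:<0> but was:<3>
        assertEquals(expectedMessageCount, observedMessageCount)
      }
    }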
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info]
[info] == core-kafka / kafka.integration.PrimitiveApiTest ==
[info] Test Starting: testProduceAndMultiFetch(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetch(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduce(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduce(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduceWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduceWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testDefaultEncoderProducerAndFetch(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testDefaultEncoderProducerAndFetch(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testDefaultEncoderProducerAndFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testDefaultEncoderProducerAndFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testConsumerNotExistTopic(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testConsumerNotExistTopic(kafka.integration.PrimitiveApiTest)
[info] == core-kafka / kafka.integration.PrimitiveApiTest ==
[info]
[info] == core-kafka / kafka.javaapi.producer.ProducerTest ==
[info] Test Starting: testSend
[info] Test Passed: testSend
[info] Test Starting: testSendSingleMessage
[info] Test Passed: testSendSingleMessage
[info] Test Starting: testInvalidPartition
[info] Test Passed: testInvalidPartition
[info] Test Starting: testSyncProducerPool
[info] Test Passed: testSyncProducerPool
[info] Test Starting: testAsyncProducerPool
[info] Test Passed: testAsyncProducerPool
[info] Test Starting: testSyncUnavailableProducerException
[info] Test Passed: testSyncUnavailableProducerException
[info] Test Starting: testAsyncUnavailableProducerException
[info] Test Passed: testAsyncUnavailableProducerException
[info] Test Starting: testConfigBrokerPartitionInfoWithPartitioner
[info] Test Passed: testConfigBrokerPartitionInfoWithPartitioner
[info] Test Starting: testConfigBrokerPartitionInfo
[info] Test Passed: testConfigBrokerPartitionInfo
[info] Test Starting: testZKSendToNewTopic
[info] Test Passed: testZKSendToNewTopic
[info] Test Starting: testZKSendWithDeadBroker
[info] Test Passed: testZKSendWithDeadBroker
[info] Test Starting: testPartitionedSendToNewTopic
[info] Test Passed: testPartitionedSendToNewTopic
[info] Test Starting: testPartitionedSendToNewBrokerInExistingTopic
[info] Test Passed: testPartitionedSendToNewBrokerInExistingTopic
[info] Test Starting: testDefaultPartitioner
[info] Test Passed: testDefaultPartitioner
[info] == core-kafka / kafka.javaapi.producer.ProducerTest ==
[info]
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testFileSize
[info] Test Passed: testFileSize
[info] Test Starting: testIterationOverPartialAndTruncation
[info] Test Passed: testIterationOverPartialAndTruncation
[info] Test Starting: testIterationDoesntChangePosition
[info] Test Passed: testIterationDoesntChangePosition
[info] Test Starting: testRead
[info] Test Passed: testRead
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info]
[info] == core-kafka / kafka.message.CompressionUtilTest ==
[info] Test Starting: testSimpleCompressDecompress
[info] Test Passed: testSimpleCompressDecompress
[info] Test Starting: testComplexCompressDecompress
[info] Test Passed: testComplexCompressDecompress
[info] Test Starting: testSnappyCompressDecompressExplicit
[info] Test Passed: testSnappyCompressDecompressExplicit
[info] == core-kafka / kafka.message.CompressionUtilTest ==
[info]
[info] == core-kafka / kafka.log.LogCorruptionTest ==
[info] Test Starting: testMessageSizeTooLarge(kafka.log.LogCorruptionTest)
This is good
[info] Test Passed: testMessageSizeTooLarge(kafka.log.LogCorruptionTest)
[info] == core-kafka / kafka.log.LogCorruptionTest ==
[info]
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_dc8c451e
[info] == core-kafka / Test cleanup 1 ==
[info]
[info] == core-kafka / test-finish ==
[error] Failed: : Total 140, Failed 2, Errors 0, Passed 138, Skipped 0
[info] == core-kafka / test-finish ==
[info]
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[error] Error running kafka.zk.ZKEphemeralTest: Test FAILED
[error] Error running kafka.integration.AutoOffsetResetTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info]
[info] Total time: 131 s, completed Aug 25, 2012 6:08:28 AM
[info]
[info] Total session time: 132 s, completed Aug 25, 2012 6:08:28 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure