kafka-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: Kafka-0.8 #56
Date Wed, 03 Oct 2012 20:13:56 GMT
See <https://builds.apache.org/job/Kafka-0.8/56/changes>

Changes:

[jjkoshy] Replace numerical compression codes in config with something human readable; KAFKA-363; patched by David Arthur; reviewed by Joel Koshy

------------------------------------------
[...truncated 3049 lines...]
[2012-10-03 20:13:33,421] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842db1c0007, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:33,821] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842db1c0005, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:34,641] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a28420ec60003, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:34,670] ERROR [group1_consumer2], error during syncedRebalance (kafka.consumer.ZookeeperConsumerConnector:102)
java.lang.NullPointerException
	at kafka.utils.ZkUtils$.getChildrenParentMayNotExist(ZkUtils.scala:374)
	at kafka.utils.ZkUtils$.getCluster(ZkUtils.scala:392)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener$$anonfun$syncedRebalance$1.apply$mcVI$sp(ZookeeperConsumerConnector.scala:385)
	at scala.collection.immutable.Range$ByOne$class.foreach$mVc$sp(Range.scala:282)
	at scala.collection.immutable.Range$$anon$2.foreach$mVc$sp(Range.scala:265)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener.syncedRebalance(ZookeeperConsumerConnector.scala:382)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener$$anon$1.run(ZookeeperConsumerConnector.scala:339)
[2012-10-03 20:13:34,921] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842ab0a0002, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:34,991] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842db1c0007, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:35,161] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842a1f70002, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:35,561] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842db1c0005, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:35,971] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a28420ec60003, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:36,441] WARN Exception causing close of session 0x13a2842ab0a0002 due to java.io.IOException: Connection reset by peer (org.apache.zookeeper.server.NIOServerCnxn:639)
[2012-10-03 20:13:36,442] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842a1f70002, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:36,844] ERROR [KafkaApi-0] error when processing request (test2,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: offset -1 is out of range
	at kafka.log.Log$.findRange(Log.scala:48)
	at kafka.log.Log.read(Log.scala:292)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:335)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:295)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:292)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:292)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:229)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:40)
	at java.lang.Thread.run(Thread.java:662)
[2012-10-03 20:13:36,845] ERROR [KafkaApi-0] error when processing request (test3,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: offset -1 is out of range
	at kafka.log.Log$.findRange(Log.scala:48)
	at kafka.log.Log.read(Log.scala:292)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:335)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:295)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:292)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:292)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:229)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:40)
	at java.lang.Thread.run(Thread.java:662)
[2012-10-03 20:13:36,845] ERROR [KafkaApi-0] error when processing request (test4,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: offset -1 is out of range
	at kafka.log.Log$.findRange(Log.scala:48)
	at kafka.log.Log.read(Log.scala:292)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:335)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:295)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:292)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:292)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:229)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:40)
	at java.lang.Thread.run(Thread.java:662)
[2012-10-03 20:13:36,845] ERROR [KafkaApi-0] error when processing request (test1,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: offset -1 is out of range
	at kafka.log.Log$.findRange(Log.scala:48)
	at kafka.log.Log.read(Log.scala:292)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:335)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:295)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:292)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:292)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:229)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:40)
	at java.lang.Thread.run(Thread.java:662)
[info] Test Passed: testProduceAndMultiFetch(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testMultiProduce(kafka.integration.LazyInitProducerTest)
[2012-10-03 20:13:36,941] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842db1c0005, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:37,041] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842db1c0007, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:37,190] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a28420ec60003, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:37,751] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842ab0a0002, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[info] Test Passed: testMultiProduce(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
[2012-10-03 20:13:38,350] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a28420ec60003, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:38,621] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842db1c0007, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:38,831] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842db1c0005, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:39,143] ERROR [KafkaApi-0] error when processing request (test,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: offset -1 is out of range
	at kafka.log.Log$.findRange(Log.scala:48)
	at kafka.log.Log.read(Log.scala:292)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:335)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:295)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:292)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map1.foreach(Map.scala:105)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map1.map(Map.scala:93)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:292)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:229)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:40)
	at java.lang.Thread.run(Thread.java:662)
[info] Test Passed: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
[2012-10-03 20:13:39,291] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842a1f70002, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:39,871] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842ab0a0002, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[info] Test Passed: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
[info] == core-kafka / kafka.integration.LazyInitProducerTest ==
[info] 
[info] == core-kafka / kafka.network.RpcDataSerializationTest ==
[info] Test Starting: testSerializationAndDeserialization
[info] Test Passed: testSerializationAndDeserialization
[info] == core-kafka / kafka.network.RpcDataSerializationTest ==
[info] 
[info] == core-kafka / kafka.admin.AdminTest ==
[info] Test Starting: testReplicaAssignment(kafka.admin.AdminTest)
[info] Test Passed: testReplicaAssignment(kafka.admin.AdminTest)
[info] Test Starting: testManualReplicaAssignment(kafka.admin.AdminTest)
[info] Test Passed: testManualReplicaAssignment(kafka.admin.AdminTest)
[info] Test Starting: testTopicCreationInZK(kafka.admin.AdminTest)
[2012-10-03 20:13:40,435] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a28420ec60003, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[2012-10-03 20:13:40,471] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842db1c0005, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
[info] Test Passed: testTopicCreationInZK(kafka.admin.AdminTest)
[info] Test Starting: testGetTopicMetadata(kafka.admin.AdminTest)
[info] Test Passed: testGetTopicMetadata(kafka.admin.AdminTest)
[info] == core-kafka / kafka.admin.AdminTest ==
[info] 
[info] == core-kafka / kafka.utils.UtilsTest ==
[info] Test Starting: testSwallow
[info] Test Passed: testSwallow
[info] Test Starting: testCircularIterator
[info] Test Passed: testCircularIterator
[info] == core-kafka / kafka.utils.UtilsTest ==
[info] 
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs(kafka.log4j.KafkaLog4jAppenderTest)
[2012-10-03 20:13:40,691] WARN EndOfStreamException: Unable to read additional data from client sessionid 0x13a2842db1c0007, likely client has closed socket (org.apache.zookeeper.server.NIOServerCnxn:634)
log4j:WARN No appenders could be found for logger (org.I0Itec.zkclient.ZkEventThread).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs(kafka.log4j.KafkaLog4jAppenderTest)
[info] Test Starting: testLog4jAppends(kafka.log4j.KafkaLog4jAppenderTest)
[info] Test Passed: testLog4jAppends(kafka.log4j.KafkaLog4jAppenderTest)
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] 
[info] == core-kafka / kafka.log.LogOffsetTest ==
[info] Test Starting: testGetOffsetsForUnknownTopic(kafka.log.LogOffsetTest)
[info] Test Passed: testGetOffsetsForUnknownTopic(kafka.log.LogOffsetTest)
[info] Test Starting: testGetOffsetsBeforeLatestTime(kafka.log.LogOffsetTest)
[info] Test Passed: testGetOffsetsBeforeLatestTime(kafka.log.LogOffsetTest)
[info] Test Starting: testEmptyLogsGetOffsets(kafka.log.LogOffsetTest)
[info] Test Passed: testEmptyLogsGetOffsets(kafka.log.LogOffsetTest)
[info] Test Starting: testGetOffsetsBeforeNow(kafka.log.LogOffsetTest)
[info] Test Passed: testGetOffsetsBeforeNow(kafka.log.LogOffsetTest)
[info] Test Starting: testGetOffsetsBeforeEarliestTime(kafka.log.LogOffsetTest)
[info] Test Passed: testGetOffsetsBeforeEarliestTime(kafka.log.LogOffsetTest)
[info] == core-kafka / kafka.log.LogOffsetTest ==
[info] 
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] Test Starting: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] 
[info] == core-kafka / kafka.message.ByteBufferMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testEquals
[info] Test Passed: testEquals
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testSmallFetchSize
[info] Test Passed: testSmallFetchSize
[info] Test Starting: testValidBytes
[info] Test Passed: testValidBytes
[info] Test Starting: testValidBytesWithCompression
[info] Test Passed: testValidBytesWithCompression
[info] Test Starting: testIterator
[info] Test Passed: testIterator
[info] == core-kafka / kafka.message.ByteBufferMessageSetTest ==
[info] 
[info] == core-kafka / kafka.producer.ProducerTest ==
[info] Test Starting: testUpdateBrokerPartitionInfo(kafka.producer.ProducerTest)
[info] Test Passed: testUpdateBrokerPartitionInfo(kafka.producer.ProducerTest)
[info] Test Starting: testSendToNewTopic(kafka.producer.ProducerTest)
[info] Test Passed: testSendToNewTopic(kafka.producer.ProducerTest)
[info] Test Starting: testSendWithDeadBroker(kafka.producer.ProducerTest)
[info] Test Passed: testSendWithDeadBroker(kafka.producer.ProducerTest)
[info] Test Starting: testAsyncSendCanCorrectlyFailWithTimeout(kafka.producer.ProducerTest)
[info] Test Passed: testAsyncSendCanCorrectlyFailWithTimeout(kafka.producer.ProducerTest)
[info] == core-kafka / kafka.producer.ProducerTest ==
[info] 
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testFileSize
[info] Test Passed: testFileSize
[info] Test Starting: testIterationOverPartialAndTruncation
[info] Test Passed: testIterationOverPartialAndTruncation
[info] Test Starting: testIterationDoesntChangePosition
[info] Test Passed: testIterationDoesntChangePosition
[info] Test Starting: testRead
[info] Test Passed: testRead
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] 
[info] == core-kafka / kafka.log.LogCorruptionTest ==
[info] Test Starting: testMessageSizeTooLarge(kafka.log.LogCorruptionTest)
[info] Test Passed: testMessageSizeTooLarge(kafka.log.LogCorruptionTest)
[info] == core-kafka / kafka.log.LogCorruptionTest ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 136, Failed 1, Errors 0, Passed 135, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /var/tmp/sbt_17529a5e
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[error] Error running kafka.message.CompressionUtilTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 210 s, completed Oct 3, 2012 8:14:22 PM
[info] 
[info] Total session time: 211 s, completed Oct 3, 2012 8:14:22 PM
[error] Error during build.
Build step 'Execute shell' marked build as failure
