kafka-commits mailing list archives

From: Apache Jenkins Server <jenk...@builds.apache.org>
Subject: Build failed in Jenkins: Kafka-0.8 #113
Date: Mon, 19 Nov 2012 06:09:42 GMT
See <https://builds.apache.org/job/Kafka-0.8/113/changes>

Changes:

[junrao] move shutting down of fetcher thread out of critical path; patched by Jun Rao; reviewed by Neha Narkhede; KAFKA-612

------------------------------------------
[...truncated 2862 lines...]
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:359)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:325)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:321)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:321)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:289)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:41)
	at java.lang.Thread.run(Thread.java:662)
[2012-11-19 06:09:14,085] ERROR [KafkaApi-0] error when processing request (test1,-1,0,10000) (kafka.server.KafkaApis:102)
kafka.common.UnknownTopicOrPartitionException: Topic test1 partition -1 doesn't exist on 0
	at kafka.server.ReplicaManager.getLeaderReplicaIfLocal(ReplicaManager.scala:163)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:359)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:325)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:321)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:321)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:289)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:41)
	at java.lang.Thread.run(Thread.java:662)
[info] Test Passed: testProduceAndMultiFetch(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduce(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduce(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testFetchRequestCanProperlySerialize(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testFetchRequestCanProperlySerialize(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testEmptyFetchRequest(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testEmptyFetchRequest(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testDefaultEncoderProducerAndFetch(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testDefaultEncoderProducerAndFetch(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testDefaultEncoderProducerAndFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testDefaultEncoderProducerAndFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchWithCompression(kafka.integration.PrimitiveApiTest)
[2012-11-19 06:09:18,635] ERROR [KafkaApi-0] error when processing request (test2,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: Request for offset -1 but we only have log segments in the range 0 to 2.
	at kafka.log.Log.read(Log.scala:371)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:368)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:325)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:321)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:321)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:289)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:41)
	at java.lang.Thread.run(Thread.java:662)
[2012-11-19 06:09:18,636] ERROR [KafkaApi-0] error when processing request (test3,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: Request for offset -1 but we only have log segments in the range 0 to 2.
	at kafka.log.Log.read(Log.scala:371)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:368)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:325)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:321)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:321)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:289)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:41)
	at java.lang.Thread.run(Thread.java:662)
[2012-11-19 06:09:18,636] ERROR [KafkaApi-0] error when processing request (test4,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: Request for offset -1 but we only have log segments in the range 0 to 2.
	at kafka.log.Log.read(Log.scala:371)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:368)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:325)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:321)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:321)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:289)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:41)
	at java.lang.Thread.run(Thread.java:662)
[2012-11-19 06:09:18,637] ERROR [KafkaApi-0] error when processing request (test1,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: Request for offset -1 but we only have log segments in the range 0 to 2.
	at kafka.log.Log.read(Log.scala:371)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:368)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:325)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:321)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:321)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:289)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:41)
	at java.lang.Thread.run(Thread.java:662)
[2012-11-19 06:09:18,639] ERROR [KafkaApi-0] error when processing request (test2,-1,0,10000) (kafka.server.KafkaApis:102)
kafka.common.UnknownTopicOrPartitionException: Topic test2 partition -1 doesn't exist on 0
	at kafka.server.ReplicaManager.getLeaderReplicaIfLocal(ReplicaManager.scala:163)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:359)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:325)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:321)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:321)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:289)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:41)
	at java.lang.Thread.run(Thread.java:662)
[2012-11-19 06:09:18,640] ERROR [KafkaApi-0] error when processing request (test3,-1,0,10000) (kafka.server.KafkaApis:102)
kafka.common.UnknownTopicOrPartitionException: Topic test3 partition -1 doesn't exist on 0
	at kafka.server.ReplicaManager.getLeaderReplicaIfLocal(ReplicaManager.scala:163)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:359)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:325)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:321)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:321)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:289)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:41)
	at java.lang.Thread.run(Thread.java:662)
[2012-11-19 06:09:18,640] ERROR [KafkaApi-0] error when processing request (test4,-1,0,10000) (kafka.server.KafkaApis:102)
kafka.common.UnknownTopicOrPartitionException: Topic test4 partition -1 doesn't exist on 0
	at kafka.server.ReplicaManager.getLeaderReplicaIfLocal(ReplicaManager.scala:163)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:359)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:325)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:321)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:321)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:289)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:41)
	at java.lang.Thread.run(Thread.java:662)
[2012-11-19 06:09:18,641] ERROR [KafkaApi-0] error when processing request (test1,-1,0,10000) (kafka.server.KafkaApis:102)
kafka.common.UnknownTopicOrPartitionException: Topic test1 partition -1 doesn't exist on 0
	at kafka.server.ReplicaManager.getLeaderReplicaIfLocal(ReplicaManager.scala:163)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:359)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:325)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:321)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
	at scala.collection.immutable.Map$Map4.map(Map.scala:157)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:321)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:289)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:57)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:41)
	at java.lang.Thread.run(Thread.java:662)
[info] Test Passed: testProduceAndMultiFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduceWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduceWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testConsumerEmptyTopic(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testConsumerEmptyTopic(kafka.integration.PrimitiveApiTest)
[info] == core-kafka / kafka.integration.PrimitiveApiTest ==
[info] 
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs(kafka.log4j.KafkaLog4jAppenderTest)
log4j:WARN No appenders could be found for logger (org.I0Itec.zkclient.ZkEventThread).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs(kafka.log4j.KafkaLog4jAppenderTest)
[info] Test Starting: testLog4jAppends(kafka.log4j.KafkaLog4jAppenderTest)
[info] Test Passed: testLog4jAppends(kafka.log4j.KafkaLog4jAppenderTest)
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] 
[info] == core-kafka / kafka.metrics.KafkaTimerTest ==
[info] Test Starting: testKafkaTimer(kafka.metrics.KafkaTimerTest)
[info] Test Passed: testKafkaTimer(kafka.metrics.KafkaTimerTest)
[info] == core-kafka / kafka.metrics.KafkaTimerTest ==
[info] 
[info] == core-kafka / kafka.message.ByteBufferMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testEquals
[info] Test Passed: testEquals
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testValidBytes
[info] Test Passed: testValidBytes
[info] Test Starting: testValidBytesWithCompression
[info] Test Passed: testValidBytesWithCompression
[info] Test Starting: testIterator
[info] Test Passed: testIterator
[info] Test Starting: testOffsetAssignment
[info] Test Passed: testOffsetAssignment
[info] == core-kafka / kafka.message.ByteBufferMessageSetTest ==
[info] 
[info] == core-kafka / kafka.producer.ProducerTest ==
[info] Test Starting: testUpdateBrokerPartitionInfo(kafka.producer.ProducerTest)
[info] Test Passed: testUpdateBrokerPartitionInfo(kafka.producer.ProducerTest)
[info] Test Starting: testSendToNewTopic(kafka.producer.ProducerTest)
[info] Test Passed: testSendToNewTopic(kafka.producer.ProducerTest)
[info] Test Starting: testSendWithDeadBroker(kafka.producer.ProducerTest)
[info] Test Passed: testSendWithDeadBroker(kafka.producer.ProducerTest)
[info] Test Starting: testAsyncSendCanCorrectlyFailWithTimeout(kafka.producer.ProducerTest)
[info] Test Passed: testAsyncSendCanCorrectlyFailWithTimeout(kafka.producer.ProducerTest)
[info] == core-kafka / kafka.producer.ProducerTest ==
[info] 
[info] == core-kafka / kafka.integration.FetcherTest ==
[info] Test Starting: testFetcher(kafka.integration.FetcherTest)
[info] Test Passed: testFetcher(kafka.integration.FetcherTest)
[info] == core-kafka / kafka.integration.FetcherTest ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_b5aaac46
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 167, Failed 2, Errors 0, Passed 165, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[info] 
[info] == hadoop consumer / copy-test-resources ==
[info] == hadoop consumer / copy-test-resources ==
[error] Error running kafka.server.LogRecoveryTest: Test FAILED
[error] Error running kafka.admin.AdminTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 233 s, completed Nov 19, 2012 6:09:42 AM
[info] 
[info] Total session time: 233 s, completed Nov 19, 2012 6:09:42 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure
