kafka-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: Kafka-0.8 #39
Date Tue, 11 Sep 2012 17:16:59 GMT
See <https://builds.apache.org/job/Kafka-0.8/39/changes>

Changes:

[junrao] trivial fix to remove a 0 byte file

------------------------------------------
[...truncated 5944 lines...]
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:250)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:59)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:38)
	at java.lang.Thread.run(Thread.java:722)
[info] Test Passed: testProduceAndMultiFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testConsumerEmptyTopic(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testConsumerEmptyTopic(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testFetchRequestCanProperlySerialize(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testFetchRequestCanProperlySerialize(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testDefaultEncoderProducerAndFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testDefaultEncoderProducerAndFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduceWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduceWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetch(kafka.integration.PrimitiveApiTest)
[2012-09-11 12:11:30,783] ERROR [KafkaApi on Broker 0], error when processing request (test2,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: offset -1 is out of range
	at kafka.log.Log$.findRange(Log.scala:46)
	at kafka.log.Log.read(Log.scala:282)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:365)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:322)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:318)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:318)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:314)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
	at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:32)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:314)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:250)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:59)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:38)
	at java.lang.Thread.run(Thread.java:722)
[2012-09-11 12:11:30,784] ERROR [KafkaApi on Broker 0], error when processing request (test3,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: offset -1 is out of range
	at kafka.log.Log$.findRange(Log.scala:46)
	at kafka.log.Log.read(Log.scala:282)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:365)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:322)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:318)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:318)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:314)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
	at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:32)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:314)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:250)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:59)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:38)
	at java.lang.Thread.run(Thread.java:722)
[2012-09-11 12:11:30,785] ERROR [KafkaApi on Broker 0], error when processing request (test4,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: offset -1 is out of range
	at kafka.log.Log$.findRange(Log.scala:46)
	at kafka.log.Log.read(Log.scala:282)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:365)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:322)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:318)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:318)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:314)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
	at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:32)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:314)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:250)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:59)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:38)
	at java.lang.Thread.run(Thread.java:722)
[2012-09-11 12:11:30,787] ERROR [KafkaApi on Broker 0], error when processing request (test1,0,-1,10000) (kafka.server.KafkaApis:102)
kafka.common.OffsetOutOfRangeException: offset -1 is out of range
	at kafka.log.Log$.findRange(Log.scala:46)
	at kafka.log.Log.read(Log.scala:282)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:365)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:322)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:318)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:318)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:314)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
	at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:32)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:314)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:250)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:59)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:38)
	at java.lang.Thread.run(Thread.java:722)
[2012-09-11 12:11:30,795] ERROR [KafkaApi on Broker 0], error when processing request (test2,-1,0,10000) (kafka.server.KafkaApis:102)
kafka.common.UnknownTopicOrPartitionException: Topic test2 partition -1 doesn't exist on 0
	at kafka.server.ReplicaManager.getLeaderReplicaIfLocal(ReplicaManager.scala:94)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:356)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:322)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:318)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:318)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:314)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
	at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:32)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:314)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:250)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:59)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:38)
	at java.lang.Thread.run(Thread.java:722)
[2012-09-11 12:11:30,796] ERROR [KafkaApi on Broker 0], error when processing request (test3,-1,0,10000) (kafka.server.KafkaApis:102)
kafka.common.UnknownTopicOrPartitionException: Topic test3 partition -1 doesn't exist on 0
	at kafka.server.ReplicaManager.getLeaderReplicaIfLocal(ReplicaManager.scala:94)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:356)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:322)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:318)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:318)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:314)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
	at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:32)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:314)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:250)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:59)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:38)
	at java.lang.Thread.run(Thread.java:722)
[2012-09-11 12:11:30,798] ERROR [KafkaApi on Broker 0], error when processing request (test4,-1,0,10000) (kafka.server.KafkaApis:102)
kafka.common.UnknownTopicOrPartitionException: Topic test4 partition -1 doesn't exist on 0
	at kafka.server.ReplicaManager.getLeaderReplicaIfLocal(ReplicaManager.scala:94)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:356)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:322)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:318)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:318)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:314)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
	at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:32)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:314)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:250)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:59)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:38)
	at java.lang.Thread.run(Thread.java:722)
[2012-09-11 12:11:30,799] ERROR [KafkaApi on Broker 0], error when processing request (test1,-1,0,10000) (kafka.server.KafkaApis:102)
kafka.common.UnknownTopicOrPartitionException: Topic test1 partition -1 doesn't exist on 0
	at kafka.server.ReplicaManager.getLeaderReplicaIfLocal(ReplicaManager.scala:94)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:356)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:322)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1$$anonfun$apply$16.apply(KafkaApis.scala:318)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:318)
	at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:314)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
	at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:32)
	at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:314)
	at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:250)
	at kafka.server.KafkaApis.handle(KafkaApis.scala:59)
	at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:38)
	at java.lang.Thread.run(Thread.java:722)
[info] Test Passed: testProduceAndMultiFetch(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testEmptyFetchRequest(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testEmptyFetchRequest(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testDefaultEncoderProducerAndFetch(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testDefaultEncoderProducerAndFetch(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduce(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduce(kafka.integration.PrimitiveApiTest)
[info] == core-kafka / kafka.integration.PrimitiveApiTest ==
[info] 
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs(kafka.log4j.KafkaLog4jAppenderTest)
log4j:WARN No appenders could be found for logger (org.I0Itec.zkclient.ZkEventThread).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs(kafka.log4j.KafkaLog4jAppenderTest)
[info] Test Starting: testLog4jAppends(kafka.log4j.KafkaLog4jAppenderTest)
[info] Test Passed: testLog4jAppends(kafka.log4j.KafkaLog4jAppenderTest)
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] 
[info] == core-kafka / kafka.log.LogOffsetTest ==
[info] Test Starting: testGetOffsetsBeforeEarliestTime(kafka.log.LogOffsetTest)
[info] Test Passed: testGetOffsetsBeforeEarliestTime(kafka.log.LogOffsetTest)
[info] Test Starting: testGetOffsetsBeforeLatestTime(kafka.log.LogOffsetTest)
[info] Test Passed: testGetOffsetsBeforeLatestTime(kafka.log.LogOffsetTest)
[info] Test Starting: testEmptyLogsGetOffsets(kafka.log.LogOffsetTest)
[info] Test Passed: testEmptyLogsGetOffsets(kafka.log.LogOffsetTest)
[info] Test Starting: testGetOffsetsForUnknownTopic(kafka.log.LogOffsetTest)
[info] Test Passed: testGetOffsetsForUnknownTopic(kafka.log.LogOffsetTest)
[info] Test Starting: testGetOffsetsBeforeNow(kafka.log.LogOffsetTest)
Offsets = 240,216,108,0
[info] Test Passed: testGetOffsetsBeforeNow(kafka.log.LogOffsetTest)
[info] == core-kafka / kafka.log.LogOffsetTest ==
[info] 
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info] Test Starting: testResetToEarliestWhenOffsetTooLow(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testResetToEarliestWhenOffsetTooLow(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testResetToLatestWhenOffsetTooLow(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testResetToLatestWhenOffsetTooLow(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testResetToEarliestWhenOffsetTooHigh(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testResetToEarliestWhenOffsetTooHigh(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testResetToLatestWhenOffsetTooHigh(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testResetToLatestWhenOffsetTooHigh(kafka.integration.AutoOffsetResetTest)
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info] 
[info] == core-kafka / kafka.server.ISRExpirationTest ==
[info] Test Starting: testISRExpirationForSlowFollowers(kafka.server.ISRExpirationTest)
[info] Test Passed: testISRExpirationForSlowFollowers(kafka.server.ISRExpirationTest)
[info] Test Starting: testISRExpirationForStuckFollowers(kafka.server.ISRExpirationTest)
[info] Test Passed: testISRExpirationForStuckFollowers(kafka.server.ISRExpirationTest)
[info] == core-kafka / kafka.server.ISRExpirationTest ==
[info] 
[info] == core-kafka / kafka.integration.TopicMetadataTest ==
[info] Test Starting: testTopicMetadataRequest(kafka.integration.TopicMetadataTest)
[info] Test Passed: testTopicMetadataRequest(kafka.integration.TopicMetadataTest)
[info] Test Starting: testAutoCreateTopic(kafka.integration.TopicMetadataTest)
[info] Test Passed: testAutoCreateTopic(kafka.integration.TopicMetadataTest)
[info] Test Starting: testBasicTopicMetadata(kafka.integration.TopicMetadataTest)
[info] Test Passed: testBasicTopicMetadata(kafka.integration.TopicMetadataTest)
[info] == core-kafka / kafka.integration.TopicMetadataTest ==
[info] 
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] Test Starting: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] 
[info] == core-kafka / kafka.consumer.ZookeeperConsumerConnectorTest ==
[info] Test Starting: testCompressionSetConsumption(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testCompressionSetConsumption(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testCompression(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testCompression(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testBasic(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testBasic(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testLeaderSelectionForPartition(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testLeaderSelectionForPartition(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testConsumerDecoder(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testConsumerDecoder(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] == core-kafka / kafka.consumer.ZookeeperConsumerConnectorTest ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 138, Failed 1, Errors 0, Passed 137, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_a8a49ef0
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[error] Error running kafka.message.CompressionUtilTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 379 s, completed Sep 11, 2012 12:12:12 PM
[info] 
[info] Total session time: 381 s, completed Sep 11, 2012 12:12:12 PM
[error] Error during build.
Build step 'Execute shell' marked build as failure
