systemml-dev mailing list archives

From "Glenn Weidner" <gweid...@us.ibm.com>
Subject Re: Build failed in Jenkins: SystemML-DailyTest #761
Date Sat, 21 Jan 2017 19:33:57 GMT

The FullReblock test last failed on January 2.  I had saved those logs and
today uploaded the information to:

https://issues.apache.org/jira/browse/SYSTEMML-1110

Here's the seed reported for the
org.apache.sysml.test.integration.functions.data.FullReblockTest.testCSVSingeMSparseMR
failure:

17/01/02 14:38:32 ERROR data.FullReblockTest: FullReblockTest failed with
seed=4742759188694491, seed2=4742759188694830
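Logging the generated seed like this is what makes a randomly failing test reproducible: the failing data can be regenerated exactly from the reported value. A minimal sketch of the pattern (illustrative only, not the actual AutomatedTestBase code):

```java
import java.util.Random;

public class SeedCapture {
    public static void main(String[] args) {
        // Draw the seed first and keep it; on failure, report it (as in
        // the seed=4742759188694491 line above) so the run can be replayed.
        long seed = new Random().nextLong();

        Random rnd = new Random(seed);
        double[] data = new double[5];
        for (int i = 0; i < data.length; i++)
            data[i] = rnd.nextDouble();

        // Re-seeding with the same value regenerates bit-identical data.
        Random replay = new Random(seed);
        for (int i = 0; i < data.length; i++)
            if (data[i] != replay.nextDouble())
                throw new AssertionError("replay diverged; seed=" + seed);
        System.out.println("replay matched for seed=" + seed);
    }
}
```

Hard-coding a reported seed in place of `new Random().nextLong()` then reproduces the exact failing input.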

Regarding functions.indexing.RightIndexingMatrixTest, I also saved the
logs for both failed builds 760 and 761.  That information has been
uploaded to a new JIRA:

https://issues.apache.org/jira/browse/SYSTEMML-1189

For easier reference, I grouped these JIRAs, together with the similar
resolved bug https://issues.apache.org/jira/browse/SYSTEMML-541, under:

https://issues.apache.org/jira/browse/SYSTEMML-1188

Thanks,
Glenn



From:	Matthias Boehm <mboehm7@googlemail.com>
To:	dev@systemml.incubator.apache.org
Date:	01/21/2017 02:21 AM
Subject:	Re: Build failed in Jenkins: SystemML-DailyTest #761



Let's keep the test, collect the seeds it used, and fix it. The failure
is triggered by a randomly generated seed, which hints at an underlying
problem with certain special cases. Btw, FullReblockTest has a similar issue.

Since I don't have access to the sparktc jenkins infrastructure, it
would be great if someone with access could simply share the seeds and
I'll have a look.

Regards,
Matthias

On 1/21/2017 1:58 AM, Deron Eriksson wrote:
> +1. It would be great to have this randomly failing test fixed, as it
> leads to a lot of unnecessary Jenkins 'build failed' messages. Could we
> please fix or remove it?
>
> Deron
>
>
> On Fri, Jan 20, 2017 at 4:54 PM, <dusenberrymw@gmail.com> wrote:
>
>> This right indexing test has been failing randomly for a while now. Does
>> anyone have any thoughts on what the problem is and how to fix it?
>>
>> --
>>
>> Mike Dusenberry
>> GitHub: github.com/dusenberrymw
>> LinkedIn: linkedin.com/in/mikedusenberry
>>
>> Sent from my iPhone.
>>
>>
>>> On Jan 20, 2017, at 3:30 PM, jenkins@spark.tc wrote:
>>>
>>> See <https://sparktc.ibmcloud.com/jenkins/job/SystemML-
>> DailyTest/761/changes>
>>>
>>> Changes:
>>>
>>> [mwdusenb] [SYSTEMML-1185] SystemML Breast Cancer Project
>>>
>>> [mwdusenb] [MINOR] Update RAT Checks to Exclude `.keep` files.
>>>
>>> ------------------------------------------
>>> [...truncated 9545 lines...]
>>> 17/01/20 15:52:53 INFO scheduler.DAGScheduler: failed: Set()
>>> 17/01/20 15:52:53 INFO scheduler.DAGScheduler: Submitting ResultStage
>> 243 (MapPartitionsRDD[669] at collectAsList at MLContextTest.java:1337),
>> which has no missing parents
>>> 17/01/20 15:52:53 INFO memory.MemoryStore: Block broadcast_243 stored
>> as values in memory (estimated size 10.5 KB, free 1045.4 MB)
>>> 17/01/20 15:52:53 INFO memory.MemoryStore: Block broadcast_243_piece0
>> stored as bytes in memory (estimated size 5.3 KB, free 1045.4 MB)
>>> 17/01/20 15:52:53 INFO storage.BlockManagerInfo: Added
>> broadcast_243_piece0 in memory on localhost:57525 (size: 5.3 KB, free:
>> 1045.8 MB)
>>> 17/01/20 15:52:53 INFO spark.SparkContext: Created broadcast 243 from
>> broadcast at DAGScheduler.scala:1012
>>> 17/01/20 15:52:53 INFO scheduler.DAGScheduler: Submitting 1 missing
>> tasks from ResultStage 243 (MapPartitionsRDD[669] at collectAsList at
>> MLContextTest.java:1337)
>>> 17/01/20 15:52:53 INFO scheduler.TaskSchedulerImpl: Adding task set
>> 243.0 with 1 tasks
>>> 17/01/20 15:52:53 INFO scheduler.FairSchedulableBuilder: Added task set
>> TaskSet_243 tasks to pool default
>>> 17/01/20 15:52:53 INFO scheduler.TaskSetManager: Starting task 0.0 in
>> stage 243.0 (TID 253, localhost, partition 0, NODE_LOCAL, 5202 bytes)
>>> 17/01/20 15:52:53 INFO executor.Executor: Running task 0.0 in stage
>> 243.0 (TID 253)
>>> 17/01/20 15:52:53 INFO storage.ShuffleBlockFetcherIterator: Getting 1
>> non-empty blocks out of 1 blocks
>>> 17/01/20 15:52:53 INFO storage.ShuffleBlockFetcherIterator: Started 0
>> remote fetches in 0 ms
>>> 17/01/20 15:52:54 INFO executor.Executor: Finished task 0.0 in stage
>> 243.0 (TID 253). 1855 bytes result sent to driver
>>> 17/01/20 15:52:54 INFO scheduler.TaskSetManager: Finished task 0.0 in
>> stage 243.0 (TID 253) in 8 ms on localhost (1/1)
>>> 17/01/20 15:52:54 INFO scheduler.TaskSchedulerImpl: Removed TaskSet
>> 243.0, whose tasks have all completed, from pool default
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: ResultStage 243
>> (collectAsList at MLContextTest.java:1337) finished in 0.008 s
>>> 17/01/20 15:52:54 INFO storage.BlockManagerInfo: Removed
>> broadcast_243_piece0 on localhost:57525 in memory (size: 5.3 KB, free:
>> 1045.8 MB)
>>> 17/01/20 15:52:54 INFO spark.ContextCleaner: Cleaned accumulator 14817
>>> 17/01/20 15:52:54 INFO spark.ContextCleaner: Cleaned shuffle 62
>>> 17/01/20 15:52:54 INFO storage.BlockManagerInfo: Removed
>> broadcast_242_piece0 on localhost:57525 in memory (size: 2.2 KB, free:
>> 1045.8 MB)
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: Registering RDD 671
>> (flatMapToPair at RDDConverterUtils.java:273)
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: Got job 181
>> (collectAsList at MLContextTest.java:1225) with 1 output partitions
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: Final stage: ResultStage
>> 245 (collectAsList at MLContextTest.java:1225)
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: Parents of final stage:
>> List(ShuffleMapStage 244)
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: Missing parents:
>> List(ShuffleMapStage 244)
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: Submitting
>> ShuffleMapStage 244 (MapPartitionsRDD[671] at flatMapToPair at
>> RDDConverterUtils.java:273), which has no missing parents
>>> 17/01/20 15:52:54 INFO memory.MemoryStore: Block broadcast_244 stored
>> as values in memory (estimated size 3.9 KB, free 1045.5 MB)
>>> 17/01/20 15:52:54 INFO memory.MemoryStore: Block broadcast_244_piece0
>> stored as bytes in memory (estimated size 2.2 KB, free 1045.5 MB)
>>> 17/01/20 15:52:54 INFO storage.BlockManagerInfo: Added
>> broadcast_244_piece0 in memory on localhost:57525 (size: 2.2 KB, free:
>> 1045.8 MB)
>>> 17/01/20 15:52:54 INFO spark.SparkContext: Created broadcast 244 from
>> broadcast at DAGScheduler.scala:1012
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: Submitting 1 missing
>> tasks from ShuffleMapStage 244 (MapPartitionsRDD[671] at flatMapToPair
>> at RDDConverterUtils.java:273)
>>> 17/01/20 15:52:54 INFO scheduler.TaskSchedulerImpl: Adding task set
>> 244.0 with 1 tasks
>>> 17/01/20 15:52:54 INFO scheduler.FairSchedulableBuilder: Added task set
>> TaskSet_244 tasks to pool default
>>> 17/01/20 15:52:54 INFO scheduler.TaskSetManager: Starting task 0.0 in
>> stage 244.0 (TID 254, localhost, partition 0, PROCESS_LOCAL, 5637 bytes)
>>> 17/01/20 15:52:54 INFO executor.Executor: Running task 0.0 in stage
>> 244.0 (TID 254)
>>> 17/01/20 15:52:54 INFO executor.Executor: Finished task 0.0 in stage
>> 244.0 (TID 254). 1079 bytes result sent to driver
>>> 17/01/20 15:52:54 INFO scheduler.TaskSetManager: Finished task 0.0 in
>> stage 244.0 (TID 254) in 5 ms on localhost (1/1)
>>> 17/01/20 15:52:54 INFO scheduler.TaskSchedulerImpl: Removed TaskSet
>> 244.0, whose tasks have all completed, from pool default
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: ShuffleMapStage 244
>> (flatMapToPair at RDDConverterUtils.java:273) finished in 0.006 s
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: looking for newly
>> runnable stages
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: running: Set()
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: waiting: Set(ResultStage
>> 245)
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: failed: Set()
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: Submitting ResultStage
>> 245 (MapPartitionsRDD[677] at collectAsList at MLContextTest.java:1225),
>> which has no missing parents
>>> 17/01/20 15:52:54 INFO memory.MemoryStore: Block broadcast_245 stored
>> as values in memory (estimated size 10.5 KB, free 1045.4 MB)
>>> 17/01/20 15:52:54 INFO memory.MemoryStore: Block broadcast_245_piece0
>> stored as bytes in memory (estimated size 5.3 KB, free 1045.4 MB)
>>> 17/01/20 15:52:54 INFO storage.BlockManagerInfo: Added
>> broadcast_245_piece0 in memory on localhost:57525 (size: 5.3 KB, free:
>> 1045.8 MB)
>>> 17/01/20 15:52:54 INFO spark.SparkContext: Created broadcast 245 from
>> broadcast at DAGScheduler.scala:1012
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: Submitting 1 missing
>> tasks from ResultStage 245 (MapPartitionsRDD[677] at collectAsList at
>> MLContextTest.java:1225)
>>> 17/01/20 15:52:54 INFO scheduler.TaskSchedulerImpl: Adding task set
>> 245.0 with 1 tasks
>>> 17/01/20 15:52:54 INFO scheduler.FairSchedulableBuilder: Added task set
>> TaskSet_245 tasks to pool default
>>> 17/01/20 15:52:54 INFO scheduler.TaskSetManager: Starting task 0.0 in
>> stage 245.0 (TID 255, localhost, partition 0, NODE_LOCAL, 5202 bytes)
>>> 17/01/20 15:52:54 INFO executor.Executor: Running task 0.0 in stage
>> 245.0 (TID 255)
>>> 17/01/20 15:52:54 INFO storage.ShuffleBlockFetcherIterator: Getting 1
>> non-empty blocks out of 1 blocks
>>> 17/01/20 15:52:54 INFO storage.ShuffleBlockFetcherIterator: Started 0
>> remote fetches in 0 ms
>>> 17/01/20 15:52:54 INFO executor.Executor: Finished task 0.0 in stage
>> 245.0 (TID 255). 1855 bytes result sent to driver
>>> 17/01/20 15:52:54 INFO scheduler.TaskSetManager: Finished task 0.0 in
>> stage 245.0 (TID 255) in 8 ms on localhost (1/1)
>>> 17/01/20 15:52:54 INFO scheduler.TaskSchedulerImpl: Removed TaskSet
>> 245.0, whose tasks have all completed, from pool default
>>> 17/01/20 15:52:54 INFO scheduler.DAGScheduler: ResultStage 245
>> (collectAsList at MLContextTest.java:1225) finished in 0.010 s
>>> 17/01/20 15:52:55 INFO storage.BlockManagerInfo: Removed
>> broadcast_244_piece0 on localhost:57525 in memory (size: 2.2 KB, free:
>> 1045.8 MB)
>>> 17/01/20 15:52:55 INFO storage.BlockManagerInfo: Removed
>> broadcast_245_piece0 on localhost:57525 in memory (size: 5.3 KB, free:
>> 1045.8 MB)
>>> 17/01/20 15:52:55 INFO spark.ContextCleaner: Cleaned accumulator 14906
>>> 17/01/20 15:52:55 INFO spark.ContextCleaner: Cleaned shuffle 63
>>> 17/01/20 15:52:56 INFO server.ServerConnector: Stopped
>> ServerConnector@2d96d835{HTTP/1.1}{0.0.0.0:4040}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@666a86e3{/stages/stage/kill,
>> null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@4a336081{/api,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@7676a795{/,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@12cbf589{/static,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@ba4ea3e{/executors/threadDump/
>> json,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@4b7ef0fa{/executors/
>> threadDump,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@5f1779dc{/executors/json,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@473a0bb{/executors,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@78d719a2{/environment/json,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@154037ab{/environment,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@234f0e1f{/storage/rdd/json,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@6f86ed{/storage/rdd,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@68d7ee7{/storage/json,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@6cdb4c59{/storage,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@425c3be4{/stages/pool/json,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@1342b4d7{/stages/pool,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@6721b3a6{/stages/stage/json,
>> null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@6818d620{/stages/stage,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@91b0a99{/stages/json,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@4eb74419{/stages,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@f8016cb{/jobs/job/json,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@395b0735{/jobs/job,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@88fc7a{/jobs/json,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO handler.ContextHandler: Stopped
>> o.s.j.s.ServletContextHandler@319e1d07{/jobs,null,UNAVAILABLE}
>>> 17/01/20 15:52:56 INFO ui.SparkUI: Stopped Spark web UI at
>> http://localhost:4040
>>> 17/01/20 15:52:56 INFO spark.MapOutputTrackerMasterEndpoint:
>> MapOutputTrackerMasterEndpoint stopped!
>>> 17/01/20 15:52:56 INFO memory.MemoryStore: MemoryStore cleared
>>> 17/01/20 15:52:56 INFO storage.BlockManager: BlockManager stopped
>>> 17/01/20 15:52:56 INFO storage.BlockManagerMaster: BlockManagerMaster
>> stopped
>>> 17/01/20 15:52:56 INFO scheduler.OutputCommitCoordinator$
>> OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
>>> 17/01/20 15:52:56 INFO spark.SparkContext: Successfully stopped
>> SparkContext
>>> Running org.apache.sysml.test.integration.mlcontext.MLContextTest
>>> Tests run: 161, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 56.358
>> sec - in org.apache.sysml.test.integration.mlcontext.MLContextTest
>>> 17/01/20 15:52:56 INFO util.ShutdownHookManager: Shutdown hook called
>>> 17/01/20 15:52:56 INFO util.ShutdownHookManager: Deleting directory
>> /tmp/spark-a488339d-1cf3-4da1-beff-6d1545755b35
>>> Running org.apache.sysml.test.integration.applications.dml.ArimaDMLTest
>>> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 195.712
>> sec - in org.apache.sysml.test.integration.applications.dml.ArimaDMLTest
>>> Running org.apache.sysml.test.integration.functions.append.
>> AppendVectorTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 377.892
>> sec - in org.apache.sysml.test.integration.functions.append.
>> AppendVectorTest
>>> Running org.apache.sysml.test.integration.functions.
>> quaternary.WeightedSigmoidTest
>>> Tests run: 56, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 985.809
>> sec - in org.apache.sysml.test.integration.functions.
>> quaternary.WeightedSigmoidTest
>>> 17/01/20 15:54:24 INFO util.ShutdownHookManager: Shutdown hook called
>>> 17/01/20 15:54:24 INFO util.ShutdownHookManager: Deleting directory
>> /tmp/spark-865c2a67-fd15-4eae-b1f8-69cbc2cbf8d5
>>> Running org.apache.sysml.test.integration.functions.
>> indexing.LeftIndexingTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 652.628
>> sec - in org.apache.sysml.test.integration.functions.
>> indexing.LeftIndexingTest
>>> Running org.apache.sysml.test.integration.applications.
>> parfor.ParForCorrelationTest
>>> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 260.959
>> sec - in org.apache.sysml.test.integration.applications.
>> parfor.ParForCorrelationTest
>>> Running org.apache.sysml.test.integration.applications.descriptivestats.
>> UnivariateUnweightedScaleDenseTest
>>> Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 324.842
>> sec - in org.apache.sysml.test.integration.applications.descriptivestats.
>> UnivariateUnweightedScaleDenseTest
>>> Running org.apache.sysml.test.integration.applications.parfor.
>> ParForCorrelationTestLarge
>>> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.526
>> sec - in org.apache.sysml.test.integration.applications.parfor.
>> ParForCorrelationTestLarge
>>> Running org.apache.sysml.test.integration.functions.reorg.FullOrderTest
>>> Tests run: 132, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
>> 1,362.813 sec - in org.apache.sysml.test.integration.functions.reorg.
>> FullOrderTest
>>> Running org.apache.sysml.test.integration.functions.quaternary.
>> WeightedDivMatrixMultTest
>>> Tests run: 90, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
>> 1,476.652 sec - in org.apache.sysml.test.integration.functions.quaternary.
>> WeightedDivMatrixMultTest
>>> 17/01/20 16:01:59 INFO util.ShutdownHookManager: Shutdown hook called
>>> 17/01/20 16:01:59 INFO util.ShutdownHookManager: Deleting directory
>> /tmp/spark-0da69e49-ab06-4b24-b6e7-e6eb21e07600
>>> Running org.apache.sysml.test.integration.applications.descriptivestats.
>> UnivariateWeightedScaleDenseTest
>>> Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 342.355
>> sec - in org.apache.sysml.test.integration.applications.descriptivestats.
>> UnivariateWeightedScaleDenseTest
>>> Running org.apache.sysml.test.integration.functions.append.
>> AppendChainTest
>>> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
>> 1,075.082 sec - in org.apache.sysml.test.integration.functions.append.
>> AppendChainTest
>>> Running org.apache.sysml.test.integration.applications.
>> parfor.ParForNaiveBayesTest
>>> Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 521.13
>> sec - in org.apache.sysml.test.integration.applications.
>> parfor.ParForNaiveBayesTest
>>> Running org.apache.sysml.test.integration.applications.descriptivestats.
>> UnivariateUnweightedScaleSparseTest
>>> Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 280.814
>> sec - in org.apache.sysml.test.integration.applications.descriptivestats.
>> UnivariateUnweightedScaleSparseTest
>>> Running org.apache.sysml.test.integration.functions.append.
>> AppendMatrixTest
>>> Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
>> 1,497.48 sec - in org.apache.sysml.test.integration.functions.append.
>> AppendMatrixTest
>>> Running org.apache.sysml.test.integration.applications.descriptivestats.
>> UnivariateWeightedScaleSparseTest
>>> Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 350.107
>> sec - in org.apache.sysml.test.integration.applications.descriptivestats.
>> UnivariateWeightedScaleSparseTest
>>> Running org.apache.sysml.test.integration.applications.dml.HITSDMLTest
>>> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 156.481
>> sec - in org.apache.sysml.test.integration.applications.dml.HITSDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.ArimaPyDMLTest
>>> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.5
>> sec - in org.apache.sysml.test.integration.applications.pydml.ArimaPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.HITSPyDMLTest
>>> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 78.545
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.HITSPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.
>> LinearRegressionDMLTest
>>> Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 102.099
>> sec - in org.apache.sysml.test.integration.applications.dml.
>> LinearRegressionDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.CsplineDSPyDMLTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.965
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.CsplineDSPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.LinearLogRegPyDMLTest
>>> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 86.725
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.LinearLogRegPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.NaiveBayesPyDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.071
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.NaiveBayesPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.
>> CsplineCGDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.608
>> sec - in org.apache.sysml.test.integration.applications.dml.
>> CsplineCGDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.L2SVMDMLTest
>>> Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 128.329
>> sec - in org.apache.sysml.test.integration.applications.dml.L2SVMDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.
>> MDABivariateStatsPyDMLTest
>>> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 294.376
>> sec - in org.apache.sysml.test.integration.applications.pydml.
>> MDABivariateStatsPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.PageRankPyDMLTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.306
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.PageRankPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.
>> NaiveBayesParforPyDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.436
>> sec - in org.apache.sysml.test.integration.applications.pydml.
>> NaiveBayesParforPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.
>> NaiveBayesDMLTest
>>> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 163.066
>> sec - in org.apache.sysml.test.integration.applications.dml.
>> NaiveBayesDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.
>> MDABivariateStatsDMLTest
>>> Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 612.506
>> sec - in org.apache.sysml.test.integration.applications.dml.
>> MDABivariateStatsDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.
>> PageRankDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.145
>> sec - in org.apache.sysml.test.integration.applications.dml.
>> PageRankDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.GLMPyDMLTest
>>> Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 589.644
>> sec - in org.apache.sysml.test.integration.applications.pydml.GLMPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.MultiClassSVMPyDMLTest
>>> Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 87.987
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.MultiClassSVMPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.ID3DMLTest
>>> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 655.197
>> sec - in org.apache.sysml.test.integration.applications.dml.ID3DMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.GLMDMLTest
>>> Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
>> 1,432.522 sec - in org.apache.sysml.test.integration.applications.dml.
>> GLMDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.CsplineCGPyDMLTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.283
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.CsplineCGPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.WelchTPyDMLTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.687
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.WelchTPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.
>> MultiClassSVMDMLTest
>>> Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 178.477
>> sec - in org.apache.sysml.test.integration.applications.dml.
>> MultiClassSVMDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.
>> CsplineDSDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.703
>> sec - in org.apache.sysml.test.integration.applications.dml.
>> CsplineDSDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.ApplyTransformPyDMLTest
>>> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.425
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.ApplyTransformPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.
>> NaiveBayesParforDMLTest
>>> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 162.851
>> sec - in org.apache.sysml.test.integration.applications.dml.
>> NaiveBayesParforDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.L2SVMPyDMLTest
>>> Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.377
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.L2SVMPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.
>> LinearRegressionPyDMLTest
>>> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.406
>> sec - in org.apache.sysml.test.integration.applications.pydml.
>> LinearRegressionPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.
>> LinearLogRegDMLTest
>>> Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 122.197
>> sec - in org.apache.sysml.test.integration.applications.dml.
>> LinearLogRegDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.GNMFDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 116.917
>> sec - in org.apache.sysml.test.integration.applications.dml.GNMFDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.WelchTDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.283
>> sec - in org.apache.sysml.test.integration.applications.dml.WelchTDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.
>> ApplyTransformDMLTest
>>> Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.348
>> sec - in org.apache.sysml.test.integration.applications.dml.
>> ApplyTransformDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.ID3PyDMLTest
>>> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 332.154
>> sec - in org.apache.sysml.test.integration.applications.pydml.ID3PyDMLTest
>>> Running org.apache.sysml.test.integration.applications.
>> pydml.GNMFPyDMLTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 56.585
>> sec - in org.apache.sysml.test.integration.applications.
>> pydml.GNMFPyDMLTest
>>>
>>> Results :
>>>
>>> Failed tests:
>>>  RightIndexingMatrixTest>AutomatedTestBase.tearDown:1472 Detailed
>> matrices characteristics:
>>> !  B-DML<->B-R # stored values in B-R: 3
>>> !  B-DML<->B-R # stored values in B-DML: 3
>>> !  B-DML<->B-R identical values(z=0): 0
>>> !  B-DML<->B-R wrong values(z=1.0E-10): 3
>>> !  B-DML<->B-R min error: 38.93887198928196
>>> !  B-DML<->B-R max error: 95.54525202237663
>>>    C-DML<->C-R # stored values in C-R: 545360
>>>    C-DML<->C-R # stored values in C-DML: 545360
>>>    C-DML<->C-R identical values(z=0): 545315
>>>    C-DML<->C-R wrong values(z=1.0E-10): 0
>>>    C-DML<->C-R min error: 0.0
>>>    C-DML<->C-R max error: 1.4210854715202004E-14
>>>    D-DML<->D-R # stored values in D-R: 6772
>>>    D-DML<->D-R # stored values in D-DML: 6772
>>>    D-DML<->D-R identical values(z=0): 6770
>>>    D-DML<->D-R wrong values(z=1.0E-10): 0
>>>    D-DML<->D-R min error: 0.0
>>>    D-DML<->D-R max error: 7.105427357601002E-15
>>>
>>>
>>> Tests run: 6541, Failures: 1, Errors: 0, Skipped: 0
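The per-matrix report above comes from a cell-wise comparison of the DML result against the R reference, counting values whose relative error exceeds an epsilon threshold (the z=1.0E-10 in the listing). A hypothetical sketch of such a check, assuming a simple relative-error criterion and not the actual AutomatedTestBase implementation:

```java
public class MatrixCompare {
    // Count cells whose relative error exceeds eps; with eps = 1e-10 a
    // tiny floating-point deviation passes, a genuinely wrong value fails.
    static int countWrong(double[][] a, double[][] b, double eps) {
        int wrong = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < a[i].length; j++) {
                double denom = Math.max(Math.abs(a[i][j]), Math.abs(b[i][j]));
                double relErr = denom == 0 ? 0 : Math.abs(a[i][j] - b[i][j]) / denom;
                if (relErr > eps)
                    wrong++;
            }
        return wrong;
    }

    public static void main(String[] args) {
        double[][] dml = {{1.0, 2.0}, {3.0, 4.0}};
        double[][] r   = {{1.0, 2.0 + 1e-14}, {3.0, 95.0}};  // one tiny, one large deviation
        System.out.println("wrong=" + countWrong(dml, r, 1e-10));  // prints wrong=1
    }
}
```

Under this reading, the B matrix failed because all 3 of its stored values exceeded the threshold, while the 45 deviating C cells (max error ~1.4E-14) stayed well within it.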
>>>
>>> [INFO]
>>> [INFO] --- maven-failsafe-plugin:2.17:verify (default) @ systemml ---
>>> [INFO] Failsafe report directory: <https://sparktc.ibmcloud.com/
>> jenkins/job/SystemML-DailyTest/ws/target/failsafe-reports>
>>> [INFO] ------------------------------------------------------------
>> ------------
>>> [INFO] BUILD FAILURE
>>> [INFO] ------------------------------------------------------------
>> ------------
>>> [INFO] Total time: 02:54 h
>>> [INFO] Finished at: 2017-01-20T17:30:11-06:00
>>> [INFO] Final Memory: 65M/2291M
>>> [INFO] ------------------------------------------------------------
>> ------------
>>> [ERROR] Failed to execute goal org.apache.maven.plugins:
>> maven-failsafe-plugin:2.17:verify (default) on project systemml: There
>> are test failures.
>>> [ERROR]
>>> [ERROR] Please refer to <https://sparktc.ibmcloud.com/
>> jenkins/job/SystemML-DailyTest/ws/target/failsafe-reports> for the
>> individual test results.
>>> [ERROR] -> [Help 1]
>>> [ERROR]
>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
>> the -e switch.
>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>> [ERROR]
>>> [ERROR] For more information about the errors and possible solutions,
>> please read the following articles:
>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/
>> MojoFailureException
>>> Build step 'Execute shell' marked build as failure
>>> Run condition [Always] enabling perform for step [[]]
>>
>
>
>



