systemml-dev mailing list archives

From Matthias Boehm <mboe...@googlemail.com>
Subject Re: Build failed in Jenkins: SystemML-DailyTest #805
Date Sun, 12 Feb 2017 19:12:05 GMT
Thanks, Deron - that helped. As it turned out, these failures only show
up in environments with many virtual cores, where individual file splits
become so small that some of them contain only metadata.
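(Editor's note: a toy sketch of the effect, not SystemML code. A parallel CSV reader typically assigns each core a byte range of the input file; with many virtual cores and a small file, some splits cover only header/metadata bytes and no complete data row. The file contents and split count below are made up for illustration.)

```python
# Toy illustration (not SystemML code): split a tiny CSV file into one
# byte-range "split" per virtual core, as a parallel reader would.
csv_bytes = b"id,name\n1,a\n2,b\n"   # tiny file: header + 2 data rows
num_cores = 8                        # many virtual cores => many splits
split_size = max(1, len(csv_bytes) // num_cores)
splits = [csv_bytes[i:i + split_size]
          for i in range(0, len(csv_bytes), split_size)]

# Splits without a newline contain no complete record at all - the
# degenerate case that only surfaced on many-core machines.
empty_splits = [s for s in splits if b"\n" not in s]
print(len(splits), len(empty_splits))  # -> 8 5
```

A robust reader therefore has to tolerate splits that carry no usable rows rather than assume every split holds at least one record.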

Regards,
Matthias

On 2/12/2017 8:46 AM, Deron Eriksson wrote:
> Using the latest code on master, these tests appear to run correctly on OS X.
>
> $ mvn test -Dtest=TransformCSVFrameEncodeReadTest
>
> -------------------------------------------------------
>  T E S T S
> -------------------------------------------------------
> Running org.apache.sysml.test.integration.functions.transform.TransformCSVFrameEncodeReadTest
> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.677 sec - in org.apache.sysml.test.integration.functions.transform.TransformCSVFrameEncodeReadTest
>
>
>
>
> On Sun, Feb 12, 2017 at 12:26 AM, Matthias Boehm <mboehm7@googlemail.com>
> wrote:
>
>> Could someone please test TransformCSVFrameEncodeReadTest on another
>> platform? In my local environment, this test runs perfectly fine - indeed,
>> I just added it for SYSTEMML-1244.
>>
>> Regards,
>> Matthias
>>
>>
>> On 2/11/2017 3:30 PM, jenkins@spark.tc wrote:
>>
>>> See <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/805/changes>
>>>
>>> Changes:
>>>
>>> [Matthias Boehm] [SYSTEMML-1244] Fix robustness csv text read (quoted
>>> recoded maps)
>>>
>>> [Matthias Boehm] [SYSTEMML-1243] Fix size update wdivmm/wsigmoid/wumm on
>>> rewrite
>>>
>>> [Matthias Boehm] [SYSTEMML-1248] Fix loop rewrite update-in-place
>>> (exclude local vars)
>>>
>>> [Matthias Boehm] [SYSTEMML-1249] Deprecate parfor perftesttool and
>>> cleanup unused code
>>>
>>> ------------------------------------------
>>> [...truncated 12436 lines...]
>>> 17/02/11 15:52:57 INFO memory.MemoryStore: Block broadcast_242 stored as
>>> values in memory (estimated size 3.9 KB, free 1045.5 MB)
>>> 17/02/11 15:52:57 INFO memory.MemoryStore: Block broadcast_242_piece0
>>> stored as bytes in memory (estimated size 2.2 KB, free 1045.5 MB)
>>> 17/02/11 15:52:57 INFO storage.BlockManagerInfo: Added
>>> broadcast_242_piece0 in memory on 169.54.146.43:54081 (size: 2.2 KB,
>>> free: 1045.8 MB)
>>> 17/02/11 15:52:57 INFO spark.SparkContext: Created broadcast 242 from
>>> broadcast at DAGScheduler.scala:996
>>> 17/02/11 15:52:57 INFO scheduler.DAGScheduler: Submitting 1 missing tasks
>>> from ShuffleMapStage 242 (MapPartitionsRDD[663] at flatMapToPair at
>>> RDDConverterUtils.java:273)
>>> 17/02/11 15:52:57 INFO scheduler.TaskSchedulerImpl: Adding task set 242.0
>>> with 1 tasks
>>> 17/02/11 15:52:57 INFO scheduler.FairSchedulableBuilder: Added task set
>>> TaskSet_242.0 tasks to pool default
>>> 17/02/11 15:52:57 INFO scheduler.TaskSetManager: Starting task 0.0 in
>>> stage 242.0 (TID 252, localhost, executor driver, partition 0,
>>> PROCESS_LOCAL, 6218 bytes)
>>> 17/02/11 15:52:57 INFO executor.Executor: Running task 0.0 in stage 242.0
>>> (TID 252)
>>> 17/02/11 15:52:57 INFO executor.Executor: Finished task 0.0 in stage
>>> 242.0 (TID 252). 1253 bytes result sent to driver
>>> 17/02/11 15:52:57 INFO scheduler.TaskSetManager: Finished task 0.0 in
>>> stage 242.0 (TID 252) in 14 ms on localhost (executor driver) (1/1)
>>> 17/02/11 15:52:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet
>>> 242.0, whose tasks have all completed, from pool default
>>> 17/02/11 15:52:57 INFO scheduler.DAGScheduler: ShuffleMapStage 242
>>> (flatMapToPair at RDDConverterUtils.java:273) finished in 0.014 s
>>> 17/02/11 15:52:57 INFO scheduler.DAGScheduler: looking for newly runnable
>>> stages
>>> 17/02/11 15:52:57 INFO scheduler.DAGScheduler: running: Set()
>>> 17/02/11 15:52:57 INFO scheduler.DAGScheduler: waiting: Set(ResultStage
>>> 243)
>>> 17/02/11 15:52:57 INFO scheduler.DAGScheduler: failed: Set()
>>> 17/02/11 15:52:57 INFO scheduler.DAGScheduler: Submitting ResultStage 243
>>> (MapPartitionsRDD[669] at collectAsList at MLContextTest.java:1337), which
>>> has no missing parents
>>> 17/02/11 15:52:57 INFO memory.MemoryStore: Block broadcast_243 stored as
>>> values in memory (estimated size 11.3 KB, free 1045.4 MB)
>>> 17/02/11 15:52:57 INFO memory.MemoryStore: Block broadcast_243_piece0
>>> stored as bytes in memory (estimated size 5.6 KB, free 1045.4 MB)
>>> 17/02/11 15:52:57 INFO storage.BlockManagerInfo: Added
>>> broadcast_243_piece0 in memory on 169.54.146.43:54081 (size: 5.6 KB,
>>> free: 1045.8 MB)
>>> 17/02/11 15:52:57 INFO spark.SparkContext: Created broadcast 243 from
>>> broadcast at DAGScheduler.scala:996
>>> 17/02/11 15:52:57 INFO scheduler.DAGScheduler: Submitting 1 missing tasks
>>> from ResultStage 243 (MapPartitionsRDD[669] at collectAsList at
>>> MLContextTest.java:1337)
>>> 17/02/11 15:52:57 INFO scheduler.TaskSchedulerImpl: Adding task set 243.0
>>> with 1 tasks
>>> 17/02/11 15:52:57 INFO scheduler.FairSchedulableBuilder: Added task set
>>> TaskSet_243.0 tasks to pool default
>>> 17/02/11 15:52:57 INFO scheduler.TaskSetManager: Starting task 0.0 in
>>> stage 243.0 (TID 253, localhost, executor driver, partition 0, ANY, 5783
>>> bytes)
>>> 17/02/11 15:52:57 INFO executor.Executor: Running task 0.0 in stage 243.0
>>> (TID 253)
>>> 17/02/11 15:52:57 INFO storage.ShuffleBlockFetcherIterator: Getting 1
>>> non-empty blocks out of 1 blocks
>>> 17/02/11 15:52:57 INFO storage.ShuffleBlockFetcherIterator: Started 0
>>> remote fetches in 0 ms
>>> 17/02/11 15:52:57 INFO executor.Executor: Finished task 0.0 in stage
>>> 243.0 (TID 253). 2024 bytes result sent to driver
>>> 17/02/11 15:52:57 INFO scheduler.TaskSetManager: Finished task 0.0 in
>>> stage 243.0 (TID 253) in 9 ms on localhost (executor driver) (1/1)
>>> 17/02/11 15:52:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet
>>> 243.0, whose tasks have all completed, from pool default
>>> 17/02/11 15:52:57 INFO scheduler.DAGScheduler: ResultStage 243
>>> (collectAsList at MLContextTest.java:1337) finished in 0.010 s
>>> 17/02/11 15:52:58 INFO storage.BlockManagerInfo: Removed
>>> broadcast_242_piece0 on 169.54.146.43:54081 in memory (size: 2.2 KB,
>>> free: 1045.8 MB)
>>> 17/02/11 15:52:58 INFO storage.BlockManagerInfo: Removed
>>> broadcast_243_piece0 on 169.54.146.43:54081 in memory (size: 5.6 KB,
>>> free: 1045.8 MB)
>>> 17/02/11 15:52:58 INFO spark.ContextCleaner: Cleaned accumulator 16145
>>> 17/02/11 15:52:58 INFO spark.ContextCleaner: Cleaned shuffle 62
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: Registering RDD 671
>>> (flatMapToPair at RDDConverterUtils.java:273)
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: Got job 181 (collectAsList
>>> at MLContextTest.java:1225) with 1 output partitions
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: Final stage: ResultStage
>>> 245 (collectAsList at MLContextTest.java:1225)
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: Parents of final stage:
>>> List(ShuffleMapStage 244)
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: Missing parents:
>>> List(ShuffleMapStage 244)
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage
>>> 244 (MapPartitionsRDD[671] at flatMapToPair at RDDConverterUtils.java:273),
>>> which has no missing parents
>>> 17/02/11 15:52:58 INFO memory.MemoryStore: Block broadcast_244 stored as
>>> values in memory (estimated size 3.9 KB, free 1045.5 MB)
>>> 17/02/11 15:52:58 INFO memory.MemoryStore: Block broadcast_244_piece0
>>> stored as bytes in memory (estimated size 2.2 KB, free 1045.5 MB)
>>> 17/02/11 15:52:58 INFO storage.BlockManagerInfo: Added
>>> broadcast_244_piece0 in memory on 169.54.146.43:54081 (size: 2.2 KB,
>>> free: 1045.8 MB)
>>> 17/02/11 15:52:58 INFO spark.SparkContext: Created broadcast 244 from
>>> broadcast at DAGScheduler.scala:996
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: Submitting 1 missing tasks
>>> from ShuffleMapStage 244 (MapPartitionsRDD[671] at flatMapToPair at
>>> RDDConverterUtils.java:273)
>>> 17/02/11 15:52:58 INFO scheduler.TaskSchedulerImpl: Adding task set 244.0
>>> with 1 tasks
>>> 17/02/11 15:52:58 INFO scheduler.FairSchedulableBuilder: Added task set
>>> TaskSet_244.0 tasks to pool default
>>> 17/02/11 15:52:58 INFO scheduler.TaskSetManager: Starting task 0.0 in
>>> stage 244.0 (TID 254, localhost, executor driver, partition 0,
>>> PROCESS_LOCAL, 6218 bytes)
>>> 17/02/11 15:52:58 INFO executor.Executor: Running task 0.0 in stage 244.0
>>> (TID 254)
>>> 17/02/11 15:52:58 INFO executor.Executor: Finished task 0.0 in stage
>>> 244.0 (TID 254). 1253 bytes result sent to driver
>>> 17/02/11 15:52:58 INFO scheduler.TaskSetManager: Finished task 0.0 in
>>> stage 244.0 (TID 254) in 7 ms on localhost (executor driver) (1/1)
>>> 17/02/11 15:52:58 INFO scheduler.TaskSchedulerImpl: Removed TaskSet
>>> 244.0, whose tasks have all completed, from pool default
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: ShuffleMapStage 244
>>> (flatMapToPair at RDDConverterUtils.java:273) finished in 0.007 s
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: looking for newly runnable
>>> stages
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: running: Set()
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: waiting: Set(ResultStage
>>> 245)
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: failed: Set()
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: Submitting ResultStage 245
>>> (MapPartitionsRDD[677] at collectAsList at MLContextTest.java:1225), which
>>> has no missing parents
>>> 17/02/11 15:52:58 INFO memory.MemoryStore: Block broadcast_245 stored as
>>> values in memory (estimated size 11.3 KB, free 1045.4 MB)
>>> 17/02/11 15:52:58 INFO memory.MemoryStore: Block broadcast_245_piece0
>>> stored as bytes in memory (estimated size 5.6 KB, free 1045.4 MB)
>>> 17/02/11 15:52:58 INFO storage.BlockManagerInfo: Added
>>> broadcast_245_piece0 in memory on 169.54.146.43:54081 (size: 5.6 KB,
>>> free: 1045.8 MB)
>>> 17/02/11 15:52:58 INFO spark.SparkContext: Created broadcast 245 from
>>> broadcast at DAGScheduler.scala:996
>>> 17/02/11 15:52:58 INFO scheduler.DAGScheduler: Submitting 1 missing tasks
>>> from ResultStage 245 (MapPartitionsRDD[677] at collectAsList at
>>> MLContextTest.java:1225)
>>> 17/02/11 15:52:58 INFO scheduler.TaskSchedulerImpl: Adding task set 245.0
>>> with 1 tasks
>>> 17/02/11 15:52:58 INFO scheduler.FairSchedulableBuilder: Added task set
>>> TaskSet_245.0 tasks to pool default
>>> 17/02/11 15:52:58 INFO scheduler.TaskSetManager: Starting task 0.0 in
>>> stage 245.0 (TID 255, localhost, executor driver, partition 0, ANY, 5783
>>> bytes)
>>> 17/02/11 15:52:58 INFO executor.Executor: Running task 0.0 in stage 245.0
>>> (TID 255)
>>> 17/02/11 15:52:58 INFO storage.ShuffleBlockFetcherIterator: Getting 1
>>> non-empty blocks out of 1 blocks
>>> 17/02/11 15:52:58 INFO storage.ShuffleBlockFetcherIterator: Started 0
>>> remote fetches in 0 ms
>>> 17/02/11 15:52:59 INFO executor.Executor: Finished task 0.0 in stage
>>> 245.0 (TID 255). 2024 bytes result sent to driver
>>> 17/02/11 15:52:59 INFO scheduler.TaskSetManager: Finished task 0.0 in
>>> stage 245.0 (TID 255) in 9 ms on localhost (executor driver) (1/1)
>>> 17/02/11 15:52:59 INFO scheduler.TaskSchedulerImpl: Removed TaskSet
>>> 245.0, whose tasks have all completed, from pool default
>>> 17/02/11 15:52:59 INFO scheduler.DAGScheduler: ResultStage 245
>>> (collectAsList at MLContextTest.java:1225) finished in 0.009 s
>>> 17/02/11 15:52:59 INFO storage.BlockManagerInfo: Removed
>>> broadcast_244_piece0 on 169.54.146.43:54081 in memory (size: 2.2 KB,
>>> free: 1045.8 MB)
>>> 17/02/11 15:52:59 INFO storage.BlockManagerInfo: Removed
>>> broadcast_245_piece0 on 169.54.146.43:54081 in memory (size: 5.6 KB,
>>> free: 1045.8 MB)
>>> 17/02/11 15:52:59 INFO spark.ContextCleaner: Cleaned accumulator 16242
>>> 17/02/11 15:52:59 INFO spark.ContextCleaner: Cleaned shuffle 63
>>> 17/02/11 15:53:00 INFO server.ServerConnector: Stopped
>>> ServerConnector@4651fa3a{HTTP/1.1}{0.0.0.0:4040}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@f2ae5b3{/stages/stage/kill,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@7f259849{/jobs/job/kill,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@1fde9c29{/api,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@396102c0{/,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@4cd79513{/static,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@1dbfb497{/executors/threadDump/json,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@5b5fb8d5{/executors/threadDump,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@2eab6d04{/executors/json,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@7764dba8{/executors,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@31aef118{/environment/json,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@ecf79d5{/environment,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@53614efe{/storage/rdd/json,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@2e4d1835{/storage/rdd,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@3da05fd3{/storage/json,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@65471699{/storage,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@51c58280{/stages/pool/json,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@5d09deec{/stages/pool,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@216f3a18{/stages/stage/json,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@27b27099{/stages/stage,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@7a7ec7da{/stages/json,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@230be223{/stages,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@22f485e2{/jobs/job/json,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@1d980125{/jobs/job,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@5c00a595{/jobs/json,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO handler.ContextHandler: Stopped
>>> o.s.j.s.ServletContextHandler@15929d92{/jobs,null,UNAVAILABLE}
>>> 17/02/11 15:53:00 INFO ui.SparkUI: Stopped Spark web UI at
>>> http://169.54.146.43:4040
>>> 17/02/11 15:53:00 INFO spark.MapOutputTrackerMasterEndpoint:
>>> MapOutputTrackerMasterEndpoint stopped!
>>> 17/02/11 15:53:00 INFO memory.MemoryStore: MemoryStore cleared
>>> 17/02/11 15:53:00 INFO storage.BlockManager: BlockManager stopped
>>> 17/02/11 15:53:00 INFO storage.BlockManagerMaster: BlockManagerMaster
>>> stopped
>>> 17/02/11 15:53:00 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
>>> 17/02/11 15:53:00 INFO spark.SparkContext: Successfully stopped
>>> SparkContext
>>> Running org.apache.sysml.test.integration.mlcontext.MLContextTest
>>> Tests run: 165, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 59.812 sec - in org.apache.sysml.test.integration.mlcontext.MLContextTest
>>> 17/02/11 15:53:00 INFO util.ShutdownHookManager: Shutdown hook called
>>> 17/02/11 15:53:00 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-c4727405-fd48-4b37-a090-6ebfeac3d23a
>>> Running org.apache.sysml.test.integration.applications.dml.ArimaDMLTest
>>> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 201.288 sec - in org.apache.sysml.test.integration.applications.dml.ArimaDMLTest
>>> Running org.apache.sysml.test.integration.functions.append.AppendVectorTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 351.825 sec - in org.apache.sysml.test.integration.functions.append.AppendVectorTest
>>> Running org.apache.sysml.test.integration.functions.indexing.LeftIndexingTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 677.408 sec - in org.apache.sysml.test.integration.functions.indexing.LeftIndexingTest
>>> Running org.apache.sysml.test.integration.functions.quaternary.WeightedSigmoidTest
>>> Tests run: 56, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 999.246 sec - in org.apache.sysml.test.integration.functions.quaternary.WeightedSigmoidTest
>>> 17/02/11 15:55:47 INFO util.ShutdownHookManager: Shutdown hook called
>>> 17/02/11 15:55:47 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-6737dbba-1df7-4ce6-83be-c0c25ba88bed
>>> Running org.apache.sysml.test.integration.applications.parfor.ParForCorrelationTest
>>> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 275.784 sec - in org.apache.sysml.test.integration.applications.parfor.ParForCorrelationTest
>>> Running org.apache.sysml.test.integration.applications.descriptivestats.UnivariateUnweightedScaleDenseTest
>>> Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 345.463 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateUnweightedScaleDenseTest
>>> Running org.apache.sysml.test.integration.applications.parfor.ParForCorrelationTestLarge
>>> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.574 sec - in org.apache.sysml.test.integration.applications.parfor.ParForCorrelationTestLarge
>>> Running org.apache.sysml.test.integration.functions.reorg.FullOrderTest
>>> Tests run: 132, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,349.944 sec - in org.apache.sysml.test.integration.functions.reorg.FullOrderTest
>>> Running org.apache.sysml.test.integration.functions.quaternary.WeightedDivMatrixMultTest
>>> Tests run: 90, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,487.774 sec - in org.apache.sysml.test.integration.functions.quaternary.WeightedDivMatrixMultTest
>>> 17/02/11 16:02:11 INFO util.ShutdownHookManager: Shutdown hook called
>>> 17/02/11 16:02:11 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-bc9fb39e-e7b6-42e2-9a98-ad59f1c8821c
>>> Running org.apache.sysml.test.integration.applications.descriptivestats.UnivariateWeightedScaleDenseTest
>>> Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 363.363 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateWeightedScaleDenseTest
>>> Running org.apache.sysml.test.integration.applications.parfor.ParForNaiveBayesTest
>>> Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 535.047 sec - in org.apache.sysml.test.integration.applications.parfor.ParForNaiveBayesTest
>>> Running org.apache.sysml.test.integration.functions.append.AppendChainTest
>>> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,067.301 sec - in org.apache.sysml.test.integration.functions.append.AppendChainTest
>>> Running org.apache.sysml.test.integration.applications.descriptivestats.UnivariateUnweightedScaleSparseTest
>>> Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 289.784 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateUnweightedScaleSparseTest
>>> Running org.apache.sysml.test.integration.applications.descriptivestats.UnivariateWeightedScaleSparseTest
>>> Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 345.256 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateWeightedScaleSparseTest
>>> Running org.apache.sysml.test.integration.functions.append.AppendMatrixTest
>>> Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,506.304 sec - in org.apache.sysml.test.integration.functions.append.AppendMatrixTest
>>> Running org.apache.sysml.test.integration.applications.dml.HITSDMLTest
>>> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 153.694 sec - in org.apache.sysml.test.integration.applications.dml.HITSDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.ArimaPyDMLTest
>>> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.265 sec - in org.apache.sysml.test.integration.applications.pydml.ArimaPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.HITSPyDMLTest
>>> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 77.771 sec - in org.apache.sysml.test.integration.applications.pydml.HITSPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.LinearRegressionDMLTest
>>> Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 102.237 sec - in org.apache.sysml.test.integration.applications.dml.LinearRegressionDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.CsplineDSPyDMLTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.92 sec - in org.apache.sysml.test.integration.applications.pydml.CsplineDSPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.LinearLogRegPyDMLTest
>>> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.602 sec - in org.apache.sysml.test.integration.applications.pydml.LinearLogRegPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.NaiveBayesPyDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.183 sec - in org.apache.sysml.test.integration.applications.pydml.NaiveBayesPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.CsplineCGDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.955 sec - in org.apache.sysml.test.integration.applications.dml.CsplineCGDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.L2SVMDMLTest
>>> Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 128.661 sec - in org.apache.sysml.test.integration.applications.dml.L2SVMDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.MDABivariateStatsPyDMLTest
>>> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 295.879 sec - in org.apache.sysml.test.integration.applications.pydml.MDABivariateStatsPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.PageRankPyDMLTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.356 sec - in org.apache.sysml.test.integration.applications.pydml.PageRankPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.NaiveBayesParforPyDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.447 sec - in org.apache.sysml.test.integration.applications.pydml.NaiveBayesParforPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.NaiveBayesDMLTest
>>> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 163.427 sec - in org.apache.sysml.test.integration.applications.dml.NaiveBayesDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.MDABivariateStatsDMLTest
>>> Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 611.362 sec - in org.apache.sysml.test.integration.applications.dml.MDABivariateStatsDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.PageRankDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.421 sec - in org.apache.sysml.test.integration.applications.dml.PageRankDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.GLMPyDMLTest
>>> Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 588.773 sec - in org.apache.sysml.test.integration.applications.pydml.GLMPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.MultiClassSVMPyDMLTest
>>> Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 88.624 sec - in org.apache.sysml.test.integration.applications.pydml.MultiClassSVMPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.ID3DMLTest
>>> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 655.34 sec - in org.apache.sysml.test.integration.applications.dml.ID3DMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.GLMDMLTest
>>> Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,474.998 sec - in org.apache.sysml.test.integration.applications.dml.GLMDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.CsplineCGPyDMLTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.287 sec - in org.apache.sysml.test.integration.applications.pydml.CsplineCGPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.WelchTPyDMLTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.922 sec - in org.apache.sysml.test.integration.applications.pydml.WelchTPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.MultiClassSVMDMLTest
>>> Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 179.191 sec - in org.apache.sysml.test.integration.applications.dml.MultiClassSVMDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.CsplineDSDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.295 sec - in org.apache.sysml.test.integration.applications.dml.CsplineDSDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.ApplyTransformPyDMLTest
>>> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.434 sec - in org.apache.sysml.test.integration.applications.pydml.ApplyTransformPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.NaiveBayesParforDMLTest
>>> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 162.783 sec - in org.apache.sysml.test.integration.applications.dml.NaiveBayesParforDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.L2SVMPyDMLTest
>>> Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.577 sec - in org.apache.sysml.test.integration.applications.pydml.L2SVMPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.LinearRegressionPyDMLTest
>>> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.4 sec - in org.apache.sysml.test.integration.applications.pydml.LinearRegressionPyDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.LinearLogRegDMLTest
>>> Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 122.904 sec - in org.apache.sysml.test.integration.applications.dml.LinearLogRegDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.GNMFDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 115.25 sec - in org.apache.sysml.test.integration.applications.dml.GNMFDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.WelchTDMLTest
>>> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.286 sec - in org.apache.sysml.test.integration.applications.dml.WelchTDMLTest
>>> Running org.apache.sysml.test.integration.applications.dml.ApplyTransformDMLTest
>>> Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.878 sec - in org.apache.sysml.test.integration.applications.dml.ApplyTransformDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.ID3PyDMLTest
>>> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 331.351 sec - in org.apache.sysml.test.integration.applications.pydml.ID3PyDMLTest
>>> Running org.apache.sysml.test.integration.applications.pydml.GNMFPyDMLTest
>>> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.087 sec - in org.apache.sysml.test.integration.applications.pydml.GNMFPyDMLTest
>>>
>>> Results :
>>>
>>> Tests in error:
>>>   TransformCSVFrameEncodeReadTest.testFrameParReadMetaHybridCSV:79->runTransformTest:122 Runtime
>>>   TransformCSVFrameEncodeReadTest.testFrameParReadMetaSparkCSV:74->runTransformTest:122 Runtime
>>>   TransformCSVFrameEncodeReadTest.testFrameParReadMetaSingleNodeCSV:69->runTransformTest:122 Runtime
>>>
>>> Tests run: 6676, Failures: 0, Errors: 3, Skipped: 0
>>>
>>> [INFO]
>>> [INFO] --- maven-failsafe-plugin:2.17:verify (default) @ systemml ---
>>> [INFO] Failsafe report directory: <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/failsafe-reports>
>>> [INFO] ------------------------------------------------------------------------
>>> [INFO] BUILD FAILURE
>>> [INFO] ------------------------------------------------------------------------
>>> [INFO] Total time: 02:54 h
>>> [INFO] Finished at: 2017-02-11T17:30:34-06:00
>>> [INFO] Final Memory: 62M/2071M
>>> [INFO] ------------------------------------------------------------------------
>>> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.17:verify (default) on project systemml: There are test failures.
>>> [ERROR]
>>> [ERROR] Please refer to <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/failsafe-reports> for the individual test results.
>>> [ERROR] -> [Help 1]
>>> [ERROR]
>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>> [ERROR]
>>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
>>> Build step 'Execute shell' marked build as failure
>>> Run condition [Always] enabling perform for step [[]]
>>>
>>>
>
>
