spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-9487) Use the same num. worker threads in Scala/Python unit tests
Date Sat, 10 Dec 2016 19:48:58 GMT

    [ https://issues.apache.org/jira/browse/SPARK-9487?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15738383#comment-15738383 ]

Sean Owen commented on SPARK-9487:
----------------------------------

Yes, that probably means the test changes aren't quite robust in their new form. Getting them
to pass both locally and on Jenkins indicates they're at least general enough to work across both
environments, and of course they have to pass on Jenkins. It can be hard to debug: try a
different machine, try loosening the conditions, or push changes to a WIP PR to see how Jenkins
treats them. I think we need to bring this to a conclusion, though. Right now I'm not convinced
this solves enough of a problem to be worth it, so I'm inclined to close it.

> Use the same num. worker threads in Scala/Python unit tests
> -----------------------------------------------------------
>
>                 Key: SPARK-9487
>                 URL: https://issues.apache.org/jira/browse/SPARK-9487
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark, Spark Core, SQL, Tests
>    Affects Versions: 1.5.0
>            Reporter: Xiangrui Meng
>              Labels: starter
>         Attachments: ContextCleanerSuiteResults, HeartbeatReceiverSuiteResults
>
>
> In Python we use `local[4]` for unit tests, while in Scala/Java we use `local[2]` and
> `local` for some unit tests in SQL, MLlib, and other components. If an operation depends
> on partition IDs, e.g., a random number generator, this leads to different results in Python
> and Scala/Java. It would be nice to use the same number in all unit tests.
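
For illustration, here is a minimal PySpark sketch (hypothetical, not taken from the issue or
from Spark's test suites; the function and app names are made up) of why the worker-thread
count matters when, as the description says, an operation is seeded by partition ID: `local[4]`
splits the data into 4 partitions while `local[2]` splits it into 2, so the per-partition seeds,
and therefore the results, differ.

    # Minimal sketch: the default parallelism sets the number of partitions,
    # and the generator below is seeded by partition ID, so the collected
    # values change with the number of local worker threads.
    import random
    from pyspark import SparkContext

    def sample_per_partition(index, iterator):
        rng = random.Random(index)          # seed = partition ID
        return [rng.random() for _ in iterator]

    sc = SparkContext("local[4]", "partition-seed-demo")   # Python tests use local[4]
    data = sc.parallelize(range(8))         # default parallelism -> 4 partitions
    print(data.mapPartitionsWithIndex(sample_per_partition).collect())
    sc.stop()
    # Running the same code with SparkContext("local[2]", ...), the Scala/Java
    # test setting, yields 2 partitions and a different set of values.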



