spark-reviews mailing list archives

From andrewor14 <>
Subject [GitHub] spark pull request: SPARK-1706: Allow multiple executors per worke...
Date Tue, 07 Apr 2015 21:53:22 GMT
Github user andrewor14 commented on the pull request:
    @CodingCat Thanks for the latest changes. It is much simpler and I believe it does what we want!

    On a separate note, I had an offline discussion with @pwendell about the config semantics. He actually proposes that we configure the exact number of cores an executor will have, rather than the maximum number of cores it could have. That is, instead of adding `spark.deploy.maxCoresPerExecutor`, we will reuse `spark.executor.cores` as suggested before, but modify the code a little so that each executor gets exactly N cores instead of at most N cores (where N is the value of `spark.executor.cores`). I will make more suggestions inline to indicate what I mean.
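
    The difference between the two semantics can be sketched as follows. This is a hypothetical illustration of the proposal, not the actual Master scheduling code: under "exactly N" semantics, a worker with `free` cores launches `free / N` executors of N cores each and leaves the remainder idle, instead of launching one last undersized executor.

    ```scala
    // Hypothetical sketch of the proposed "exactly N cores per executor" rule.
    // `executorsToLaunch` returns the core count of each executor a worker
    // with `freeCores` free cores would launch, given spark.executor.cores = n.
    object ExactCoresSketch {
      def executorsToLaunch(freeCores: Int, coresPerExecutor: Int): Seq[Int] =
        Seq.fill(freeCores / coresPerExecutor)(coresPerExecutor)

      def main(args: Array[String]): Unit = {
        // Worker with 8 free cores, spark.executor.cores = 3:
        // two 3-core executors; the remaining 2 cores stay unused,
        // rather than forming a third executor with only 2 cores.
        println(executorsToLaunch(8, 3)) // List(3, 3)
      }
    }
    ```

    Under the earlier "at most N" semantics, the same worker could also spawn a final 2-core executor from the leftover cores, which is exactly the behavior the proposal rules out.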

