spark-user mailing list archives

From Mayur Rustagi <mayur.rust...@gmail.com>
Subject Re: Set the number/memory of workers under mesos
Date Fri, 20 Jun 2014 20:40:09 GMT
You should be able to configure these in the SparkContext from the Spark shell,
via spark.cores.max (total cores across the cluster) and spark.executor.memory.
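A minimal sketch of setting these properties when constructing the context (the Mesos master URL here is a placeholder; substitute your own):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical Mesos master URL -- replace host/port with your cluster's.
val conf = new SparkConf()
  .setMaster("mesos://master-host:5050")
  .set("spark.cores.max", "8")        // cap on total cores the app may use
  .set("spark.executor.memory", "4g") // memory allocated per executor
val sc = new SparkContext(conf)
```

The same properties can instead be placed in conf/spark-defaults.conf so that spark-shell picks them up at launch without code changes.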
Regards
Mayur

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Fri, Jun 20, 2014 at 4:30 PM, Shuo Xiang <shuoxiangpub@gmail.com> wrote:

> Hi, just wondering if anybody knows how to set the number of workers (and
> the amount of memory) under Mesos while launching spark-shell? I was trying
> to edit conf/spark-env.sh, but it looks like those environment variables
> are for YARN or standalone mode. Thanks!