spark-user mailing list archives

From Nan Zhu <zhunanmcg...@gmail.com>
Subject Re: How to use more executors
Date Wed, 21 Jan 2015 23:57:19 GMT
…not sure when it will be reviewed…

but for now you can work around it by allowing multiple worker instances on a single machine:

http://spark.apache.org/docs/latest/spark-standalone.html

search for SPARK_WORKER_INSTANCES
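
for example, something like this in conf/spark-env.sh on each worker machine (the numbers are just an illustration for an 8-core / 16g box; tune them so the instances don't oversubscribe the hardware):

    # run two worker daemons on this machine
    export SPARK_WORKER_INSTANCES=2
    # split the machine's resources between the two instances
    export SPARK_WORKER_CORES=4
    export SPARK_WORKER_MEMORY=8g

then restart the cluster (sbin/stop-all.sh, sbin/start-all.sh); each worker instance can host its own executor for your app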

Best,  

--  
Nan Zhu
http://codingcat.me


On Wednesday, January 21, 2015 at 6:50 PM, Larry Liu wrote:

> Will SPARK-1706 be included in the next release?
>  
> On Wed, Jan 21, 2015 at 2:50 PM, Ted Yu <yuzhihong@gmail.com> wrote:
> > Please see SPARK-1706
> >  
> > On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu <larryliu05@gmail.com> wrote:
> > > I tried to submit a job with --conf "spark.cores.max=6" or --total-executor-cores 6 on a standalone cluster, but I don't see more than 1 executor on each worker. I am wondering how to use multiple executors when submitting jobs.
> > >  
> > > Thanks
> > > larry
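
(a sketch of the submit side once each machine runs two workers as above; the master URL, class name, and jar are placeholders)

    # --total-executor-cores is the command-line equivalent of spark.cores.max;
    # with two worker instances per machine, the 6 cores can be spread across
    # up to 6 executors, one per worker instance
    spark-submit \
      --master spark://<master-host>:7077 \
      --total-executor-cores 6 \
      --class com.example.MyApp \
      my-app.jar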

