hadoop-hdfs-user mailing list archives

From Alvaro Brandon <alvarobran...@gmail.com>
Subject Choosing a subset of machines to launch Spark Application
Date Tue, 07 Feb 2017 10:15:01 GMT
Hello all:

I have the following scenario.
- I have a cluster of 50 machines with Hadoop and Spark installed on them.
- I want to launch one Spark application through spark-submit. However, I
want this application to run on only a subset of these machines (e.g., 10
machines), disregarding data locality.

Is this possible? Is there any option in YARN that allows this?
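One approach that matches this scenario is YARN node labels: the cluster admin tags the target machines with a label, and the Spark application restricts its containers to that label. A minimal sketch, assuming a hypothetical label name `spark-subset` has already been created and assigned to the 10 machines on the YARN side:

```shell
# Admin side (assumed already done): create the label and assign it to the
# chosen NodeManagers. "spark-subset" and the host names are placeholders.
#   yarn rmadmin -addToClusterNodeLabels "spark-subset"
#   yarn rmadmin -replaceLabelsOnNode "node01=spark-subset node02=spark-subset"

# Application side: pin both the ApplicationMaster and the executors to
# nodes carrying that label, so no containers land on the other 40 machines.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.am.nodeLabelExpression=spark-subset \
  --conf spark.yarn.executor.nodeLabelExpression=spark-subset \
  --class com.example.MyApp \
  myapp.jar
```

Note this only restricts where containers are placed; it is independent of HDFS data placement, so locality is indeed disregarded, which matches the requirement above. Node labels also need a capacity-scheduler queue configured to access the label, so this depends on cluster setup.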
