spark-user mailing list archives

From James <alcaid1...@gmail.com>
Subject Re: How to avoid using some nodes while running a spark program on yarn
Date Sat, 14 Mar 2015 09:57:44 GMT
My Hadoop version is 2.2.0, and my Spark version is 1.2.0.
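
For reference, the node-labels approach Ted suggests would look roughly like the sketch below. This assumes the cluster is upgraded to a Hadoop release that ships YARN node labels (2.6+; YARN-796 was not in 2.2.0) and a Spark version exposing the `spark.yarn.am.nodeLabelExpression` / `spark.yarn.executor.nodeLabelExpression` settings (Spark 1.6+). The label name `batch` and the application class/jar are placeholders:

```shell
# Hypothetical sketch, not applicable to Hadoop 2.2.0 / Spark 1.2.0:
# first, the cluster admin tags the "safe" nodes with a label, e.g.
#   yarn rmadmin -addToClusterNodeLabels batch
#   yarn rmadmin -replaceLabelsOnNode "node1.example.com=batch"
# then the job requests containers only on nodes carrying that label:
spark-submit \
  --master yarn \
  --conf spark.yarn.am.nodeLabelExpression=batch \
  --conf spark.yarn.executor.nodeLabelExpression=batch \
  --class com.example.MyApp \
  myapp.jar
```

With this, the ResourceManager allocates the AM and executors only on nodes labelled `batch`, which achieves the "blacklist" effect by whitelisting the remaining nodes instead.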

2015-03-14 17:22 GMT+08:00 Ted Yu <yuzhihong@gmail.com>:

> Which release of hadoop are you using ?
>
> Can you utilize node labels feature ?
> See YARN-2492 and YARN-796
>
> Cheers
>
> On Sat, Mar 14, 2015 at 1:49 AM, James <alcaid1801@gmail.com> wrote:
>
>> Hello,
>>
>> I have a cluster running Spark on YARN. Currently some of its nodes are
>> running a Spark Streaming program, so their local disk space is not enough
>> to support other applications. I wonder whether it is possible to use a
>> blacklist to avoid these nodes when running a new Spark program?
>>
>> Alcaid
>>
>
>
