hadoop-user mailing list archives

From: Chris Mawata <chris.maw...@gmail.com>
Subject: Re: Max Parallel task executors
Date: Fri, 06 Nov 2015 12:27:29 GMT
Also check that you have more than 31 blocks to process.
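One quick way to check, sketched here with a placeholder path for your job's
input:

    hdfs fsck /path/to/job/input -files -blocks | grep "Total blocks"

With the default split size, each HDFS block typically becomes one map task,
so an input with only around 31 blocks would cap the number of concurrent
maps regardless of how much cluster capacity is available.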
On Nov 6, 2015 6:54 AM, "sandeep das" <yarnhadoop@gmail.com> wrote:

> Hi Varun,
>
> I tried to increase this parameter, but it did not increase the number of
> parallel tasks; if it is decreased, though, YARN does reduce the number of
> parallel tasks. I'm a bit puzzled why it does not go above 31 tasks even
> after its value is increased.
>
> Is there any other configuration that controls the maximum number of
> tasks that can execute in parallel?
>
> Regards,
> Sandeep
>
> On Tue, Nov 3, 2015 at 7:29 PM, Varun Vasudev <vvasudev@apache.org> wrote:
>
>> The number of parallel tasks that are run depends on the amount of memory
>> and vcores on your machines and the amount of memory and vcores required
>> by your mappers and reducers. The amount of memory can be set via
>> yarn.nodemanager.resource.memory-mb (the default is 8 GB). The amount of
>> vcores can be set via yarn.nodemanager.resource.cpu-vcores (the default
>> is 8 vcores).
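>>
>> For illustration, a minimal yarn-site.xml sketch for one of the 24-CPU
>> nodes mentioned below; the values are assumptions to adapt to your
>> hardware, not recommendations:
>>
>>     <property>
>>       <name>yarn.nodemanager.resource.memory-mb</name>
>>       <!-- assumed: ~24 GB of the node's RAM reserved for containers -->
>>       <value>24576</value>
>>     </property>
>>     <property>
>>       <name>yarn.nodemanager.resource.cpu-vcores</name>
>>       <!-- assumed: one vcore per physical CPU -->
>>       <value>24</value>
>>     </property>
>>
>> The per-task side is set by mapreduce.map.memory.mb,
>> mapreduce.reduce.memory.mb, mapreduce.map.cpu.vcores and
>> mapreduce.reduce.cpu.vcores, so concurrent tasks per node are roughly
>> min(node memory / task memory, node vcores / task vcores), depending on
>> the scheduler's resource calculator. With the defaults, a 4-node cluster
>> offers 4 x 8 = 32 vcores; one container goes to the MapReduce
>> ApplicationMaster, which could explain a ceiling of 31 concurrent tasks.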
>>
>> -Varun
>>
>> From: sandeep das <yarnhadoop@gmail.com>
>> Reply-To: <user@hadoop.apache.org>
>> Date: Monday, November 2, 2015 at 3:56 PM
>> To: <user@hadoop.apache.org>
>> Subject: Max Parallel task executors
>>
>> Hi Team,
>>
>> I have a Cloudera cluster of 4 nodes. Whenever I submit a job, only 31
>> parallel tasks are executed, even though my machines have more CPU
>> available; YARN/the AM still does not create more tasks.
>>
>> Is there any configuration I can change to start more map/reduce tasks
>> in parallel?
>>
>> Each machine in my cluster has 24 CPUs.
>>
>> Regards,
>> Sandeep
>>
>
>
