hadoop-mapreduce-issues mailing list archives

From "zhihai xu (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (MAPREDUCE-6696) Add a configuration to limit the number of map tasks allowed per job.
Date Thu, 19 May 2016 16:01:12 GMT

    [ https://issues.apache.org/jira/browse/MAPREDUCE-6696?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15291366#comment-15291366 ]

zhihai xu commented on MAPREDUCE-6696:
--------------------------------------

The test failures are not related to my change. They are already reported at https://issues.apache.org/jira/browse/MAPREDUCE-6702.

> Add a configuration to limit the number of map tasks allowed per job.
> ---------------------------------------------------------------------
>
>                 Key: MAPREDUCE-6696
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-6696
>             Project: Hadoop Map/Reduce
>          Issue Type: Improvement
>          Components: job submission
>    Affects Versions: 2.8.0
>            Reporter: zhihai xu
>            Assignee: zhihai xu
>         Attachments: MAPREDUCE-6696.000.patch, MAPREDUCE-6696.001.patch, MAPREDUCE-6696.002.patch,
> MAPREDUCE-6696.003.patch
>
>
> Add a configuration "mapreduce.job.max.map" to limit the number of map tasks allowed
> per job. It will be useful for Hadoop admins to conserve cluster resources by preventing
> users from submitting overly large MapReduce jobs. A MapReduce job with too many mappers
> may fail with OOM after running for a long time, which wastes a lot of cluster resources.
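A minimal sketch (not the actual patch) of how the "mapreduce.job.max.map" cap described above might be enforced at job submission time. The class name, method name, and the -1 "unlimited" default are assumptions for illustration, not taken from the attached patches.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;

    // Hypothetical helper: reject a job whose planned map-task count exceeds
    // the configured "mapreduce.job.max.map" cap.
    public class MapTaskLimitCheck {
      public static final String JOB_MAX_MAP = "mapreduce.job.max.map";

      public static void checkMapLimit(Configuration conf, int numMaps) throws IOException {
        // -1 (assumed default) means no limit is enforced.
        int maxMaps = conf.getInt(JOB_MAX_MAP, -1);
        if (maxMaps >= 0 && numMaps > maxMaps) {
          throw new IOException("The number of map tasks (" + numMaps
              + ") exceeds the configured limit " + JOB_MAX_MAP + "=" + maxMaps);
        }
      }
    }

With a check like this, an admin could set the cap cluster-wide in mapred-site.xml (typically marked final so users cannot override it), and a job requesting more map tasks than allowed would be rejected at submission rather than failing late with OOM.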



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: mapreduce-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: mapreduce-issues-help@hadoop.apache.org

