hadoop-hdfs-user mailing list archives

From Arun C Murthy <...@hortonworks.com>
Subject Re: Hadoop YARN 2.2.0 Streaming Memory Limitation
Date Mon, 24 Feb 2014 22:49:44 GMT
Can you please try with mapreduce.map.memory.mb = 5120 and mapreduce.map.java.opts = "-Xmx1024m"?

This way the map JVM heap gets 1024 MB, and roughly 4 GB of the container's allocation remains free for the forked process.
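In mapred-site.xml that looks roughly like the following (a sketch; the same two values can also be passed per job with -D on the streaming command line):

  <property>
    <name>mapreduce.map.memory.mb</name>
    <value>5120</value>   <!-- container size requested from YARN -->
  </property>
  <property>
    <name>mapreduce.map.java.opts</name>
    <value>-Xmx1024m</value>   <!-- max JVM heap inside the container -->
  </property>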

Hope that helps.

Arun

On Feb 24, 2014, at 1:27 AM, Patrick Boenzli <patrick.boenzli@soom-it.ch> wrote:

> Hello hadoop-users!
> 
> We are currently facing a frustrating Hadoop streaming memory problem. Our setup:
> 
> - our compute nodes have about 7 GB of RAM
> - hadoop streaming starts a bash script which uses about 4 GB of RAM
> - therefore it is only possible to run exactly one such task per node
> 
> Out of the box, each Hadoop instance starts about 7 containers with default
> settings. Each task forks a bash script that needs about 4 GB of RAM; the first
> fork works, but all subsequent ones fail because they run out of memory. So what
> we are looking for is a way to limit the number of containers to exactly one.
> Here is what we found on the internet:
> 
> yarn.scheduler.maximum-allocation-mb and mapreduce.map.memory.mb are set to
> values such that there is at most one container per node. This means
> mapreduce.map.memory.mb must be more than half of the node's maximum memory
> (otherwise YARN will schedule multiple containers).
> Done right, this gives us one container per node. But it creates a new problem:
> since our Java process now reserves more than half of the maximum memory, the
> child (bash) process we fork inherits the parent's memory footprint, and because
> the parent already used more than half of total memory, we run out of memory
> again. If we lower the map memory, Hadoop allocates two containers per node,
> which run out of memory too.
> 
> Since this problem is a blocker in our current project, we are evaluating
> adapting the source code to solve it, as a last resort. Any ideas on this are
> very much welcome.
> 
> We would be very happy for any help offered!
> Thanks!
> 
> 
> PS: We also asked this question on Stack Overflow three days ago
> (http://stackoverflow.com/questions/21933937/hadoop-2-2-0-streaming-memory-limitation).
> No answers yet. If answers appear in either forum, we will sync them.
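
For reference, the one-container-per-node arrangement described above can be pinned down explicitly in yarn-site.xml. A sketch, assuming roughly 7 GB of usable memory per node (the 7168 figure is my assumption, not a value from the original message):

  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>7168</value>   <!-- total memory YARN may allocate on this node -->
  </property>
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>7168</value>   <!-- largest single container the scheduler will grant -->
  </property>

With mapreduce.map.memory.mb = 5120 as suggested above, the scheduler fits floor(7168 / 5120) = 1 map container per node, and the 1 GB heap cap leaves headroom inside that container for the 4 GB bash process.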

--
Arun C. Murthy
Hortonworks Inc.
http://hortonworks.com/



