hadoop-mapreduce-user mailing list archives

From Mapred Learn <mapred.le...@gmail.com>
Subject Re: how to set huge memory for reducer in streaming
Date Sun, 29 Jul 2012 08:25:22 GMT
Hi Harsh,
I tried all these but still fails.


Sent from my iPhone

On Jul 29, 2012, at 1:23 AM, Harsh J <harsh@cloudera.com> wrote:

> Hi,
> 
> You may raise your heap size via mapred.child.java.opts (or
> mapred.reduce.child.java.opts for reducers alone), and further raise
> the virtual-mem
> limit via mapred.child.ulimit (try setting it to 2x or 3x the heap
> size, in KB, or higher). I think it's the latter limit you're
> exhausting, since there's a subprocess involved.
> 
> Let us know if that helps.
> 
> On Sun, Jul 29, 2012 at 1:47 PM, Mapred Learn <mapred.learn@gmail.com> wrote:
>> hi,
>> One of my programs creates a huge python dictionary, and the reducers fail
>> with a MemoryError every time.
>> 
>> Is there a way to give the reducers more memory so that they
>> succeed?
>> 
>> I know we should not have this requirement in the first place and not create
>> this kind of dictionary, but can I still finish this job by giving more
>> memory in the jar command?
>> 
>> 
>> Thanks,
>> JJ
>> 
> 
> 
> 
> -- 
> Harsh J
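
For reference, here is a sketch of how the options Harsh suggests might be passed to a streaming job. The heap size (2 GB), input/output paths, and script names below are placeholders, not values from this thread; adjust them to your cluster.

```shell
#!/bin/sh
# Illustrative only: the 2 GB heap and all paths/script names are assumptions.
HEAP_MB=2048                        # reducer JVM heap, in MB
ULIMIT_KB=$((HEAP_MB * 1024 * 3))   # mapred.child.ulimit is in KB; 3x the heap
echo "heap=${HEAP_MB}m ulimit=${ULIMIT_KB}KB"

# The streaming invocation would then look roughly like:
# hadoop jar hadoop-streaming.jar \
#   -D mapred.reduce.child.java.opts=-Xmx${HEAP_MB}m \
#   -D mapred.child.ulimit=${ULIMIT_KB} \
#   -input /path/to/input -output /path/to/output \
#   -mapper mapper.py -reducer reducer.py \
#   -file mapper.py -file reducer.py
```

Note that mapred.child.ulimit bounds the virtual memory of the whole child process tree, which in streaming includes the forked python subprocess, so it must be comfortably larger than the JVM heap alone.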
