hadoop-mapreduce-user mailing list archives

From Eric Fiala <e...@fiala.ca>
Subject Re: Mapreduce heap size error
Date Sun, 13 Nov 2011 23:34:46 GMT
Hoot, these are big numbers - some thoughts:
1) does your machine have ~1 TB to spare for each Java child process (each
mapper + each reducer)?  mapred.child.java.opts / -Xmx1048576m is 1048576 MB,
i.e. 1024 GB per child JVM
2) does each of your daemons need / have 10 GB? HADOOP_HEAPSIZE=10000 is in MB

hth
EF

> # The maximum amount of heap to use, in MB. Default is 1000.
>  export HADOOP_HEAPSIZE=10000
>
> <property>
> <name>mapred.child.java.opts</name>
> <value>-Xmx1048576m</value>
> </property>
>
>
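For comparison, a more typical configuration keeps both values in the
low-gigabyte range. The sketch below is illustrative only - the right numbers
depend on the machine's physical RAM and the number of map/reduce slots per
node (assumed here: ~1 GB per child, default 1 GB daemon heap):

```
# conf/hadoop-env.sh - heap per Hadoop daemon, in MB (default is 1000)
export HADOOP_HEAPSIZE=1000

<!-- conf/mapred-site.xml - heap per map/reduce child JVM -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1024m</value>
</property>
```

Note the unit difference: HADOOP_HEAPSIZE is a bare number interpreted as MB,
while -Xmx takes an explicit suffix (m or g), which is how a stray "m" turns
an intended 1 GB into a requested 1 TB.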
