hadoop-mapreduce-user mailing list archives

From Manoj Babu <manoj...@gmail.com>
Subject Re: JAVA heap error for the tasks in mapreduce
Date Wed, 20 Mar 2013 06:48:34 GMT
Hi,

You have to increase the upper limit as well; check the mapred.child.ulimit
property. It caps the virtual memory (in kilobytes) of the child JVM, and it
must be set higher than the heap size you request with -Xmx.
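
For example (values are illustrative; in Hadoop 1.x these go in mapred-site.xml
or are passed per job with -D), the ulimit should comfortably exceed the heap:

```xml
<!-- Illustrative values: a 4 GB task heap under an 8 GB virtual-memory cap.
     mapred.child.ulimit is in kilobytes and must exceed -Xmx plus JVM overhead. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx4096m</value>
</property>
<property>
  <name>mapred.child.ulimit</name>
  <value>8388608</value>
</property>
```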

Cheers!
Manoj.


On Wed, Mar 20, 2013 at 12:10 PM, Abhishek Shivkumar <
abhisheksgumadi@gmail.com> wrote:

> Hi,
>
>     I have a setup() method in the Mapper.java class where I am reading in
> a 1.6 GB HashMap that was serialized into a file and stored in HDFS. When I
> am running the job, it gets stuck at the readobject() method that reads
> this serialized file into a HashMap.
>
> I increased the heap size both by doing export HADOOP_HEAPSIZE=4096 and
> also by writing conf.set("mapred.map.child.opts", "-Xmx4096M"); and
> conf.set("mapred.reduce.child.opts", "-Xmx4096M");
>
> It still doesn't help. Should we do something else? If I set
> HADOOP_HEAPSIZE beyond this, the hadoop command doesn't run and fails to
> instantiate a JVM.
>
> Any comments would be appreciated!
>
> Thank you!
>
> With Regards,
> Abhishek S
>
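A self-contained sketch of the serialization round trip that setup() performs
(the class name is illustrative, and in-memory streams stand in for the HDFS
file the original code reads):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.HashMap;
import java.util.Map;

public class SerializedMapRoundTrip {
    // Serialize a map to bytes and read it back with readObject(),
    // mirroring what the Mapper's setup() does against an HDFS file.
    @SuppressWarnings("unchecked")
    public static Map<String, Integer> roundTrip(Map<String, Integer> in)
            throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(in);
        }
        // readObject() must materialize the whole map in heap at once;
        // the live object graph can be several times the serialized size,
        // which is why a 1.6 GB file can exhaust a 4 GB heap.
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return (Map<String, Integer>) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Map<String, Integer> original = new HashMap<>();
        original.put("alpha", 1);
        original.put("beta", 2);
        System.out.println(roundTrip(original).equals(original)); // prints "true"
    }
}
```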
