hadoop-user mailing list archives

From "Bejoy KS" <bejoy.had...@gmail.com>
Subject Re: hadoop memory settings
Date Fri, 05 Oct 2012 11:08:22 GMT
Hi Sadak

AFAIK, HADOOP_HEAPSIZE determines the JVM heap size of the daemons (NN, JT, TT, DN, etc.).

mapred.child.java.opts and mapred.child.ulimit are used to set the JVM heap and the memory
limit for the child JVMs launched for each map/reduce task.
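A minimal sketch of where the task-level settings live, with illustrative values only (tune to your own hardware):

```xml
<!-- mapred-site.xml: example values, not recommendations -->
<configuration>
  <property>
    <name>mapred.child.java.opts</name>
    <!-- max heap for each task JVM -->
    <value>-Xmx200m</value>
  </property>
  <property>
    <name>mapred.child.ulimit</name>
    <!-- virtual memory limit in KB; must comfortably exceed the -Xmx above -->
    <value>1048576</value>
  </property>
</configuration>
```

HADOOP_HEAPSIZE itself goes in hadoop-env.sh (e.g. `export HADOOP_HEAPSIZE=256`, a value in MB) and applies to each daemon, not to task JVMs.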

Regards
Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: Visioner Sadak <visioner.sadak@gmail.com>
Date: Fri, 5 Oct 2012 13:47:24 
To: <user@hadoop.apache.org>
Reply-To: user@hadoop.apache.org
Subject: Re: hadoop memory settings

Because I'm getting "Error occurred during initialization of VM" and
java.lang.Throwable: Child Error at org.apache.hadoop.mapred.TaskRunner.run
when running a job..... :)

On Fri, Oct 5, 2012 at 1:39 PM, Visioner Sadak <visioner.sadak@gmail.com>wrote:

> Is there a relation between the HADOOP_HEAPSIZE, mapred.child.java.opts, and
> mapred.child.ulimit settings in hadoop-env.sh and mapred-site.xml? I have a
> single machine with 2 GB RAM running Hadoop in pseudo-distributed mode, and my
> HADOOP_HEAPSIZE is set to 256. What should I set mapred.child.java.opts and
> mapred.child.ulimit to, and how are these settings calculated if my RAM is
> increased or the cluster grows?
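The sizing question above can be sketched as back-of-the-envelope arithmetic. The numbers below are assumptions for a 2 GB pseudo-distributed machine, not an official Hadoop formula:

```python
# Hypothetical sizing sketch: subtract daemon heaps and an OS reserve from
# total RAM, then split what's left across concurrent task slots.
total_ram_mb = 2048      # machine RAM (2 GB)
daemon_heap_mb = 256     # HADOOP_HEAPSIZE per daemon
daemons = 4              # NN, DN, JT, TT in pseudo-distributed mode
os_reserve_mb = 512      # assumed headroom for the OS and other processes

available_mb = total_ram_mb - daemons * daemon_heap_mb - os_reserve_mb
task_slots = 2           # concurrent map + reduce tasks (assumed)
child_heap_mb = available_mb // task_slots

print(available_mb)      # 512
print(child_heap_mb)     # 256 -> e.g. mapred.child.java.opts = -Xmx256m
```

If RAM grows or more machines are added, the same arithmetic is redone per node: total RAM minus daemon heaps minus OS reserve, divided by the number of task slots on that node.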
