hadoop-mapreduce-user mailing list archives

From shashwat shriparv <dwivedishash...@gmail.com>
Subject Re: Best way handle hadoop Java heap size space
Date Fri, 15 Mar 2013 09:58:32 GMT
If you just want to see the utilization and monitor it for some time, try

Java VisualVM
or
JConsole

and connect to the JMX port. They will show you the heap usage and other details
too.

You can find both tools in the JDK bin folder.
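
Hadoop daemons do not expose a remote JMX port out of the box, so here is a minimal
sketch of how you might enable it in hadoop-env.sh so that JConsole or VisualVM can
attach remotely; the port number and the disabled authentication/SSL are assumptions
for a test setup, not something to copy into production:

  # hadoop-env.sh -- illustrative only; port and security settings are assumptions
  export HADOOP_NAMENODE_OPTS="$HADOOP_NAMENODE_OPTS \
    -Dcom.sun.management.jmxremote \
    -Dcom.sun.management.jmxremote.port=8004 \
    -Dcom.sun.management.jmxremote.authenticate=false \
    -Dcom.sun.management.jmxremote.ssl=false"

After restarting the daemon you can attach from the JDK bin folder with, for example,
jconsole <namenode-host>:8004.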





∞
Shashwat Shriparv



On Tue, Feb 5, 2013 at 10:34 PM, Dhanasekaran Anbalagan
<bugcy013@gmail.com> wrote:

> Hi Guys,
>
> We have configured many heap-size-related settings in Hadoop, for example:
>
> Namenode's Java Heap Size in bytes.
> Secondary namenode's Java Heap Size in bytes.
> Balancer's Java Heap Size in bytes.
> HttpFS's Java Heap Size in bytes.
> Failover Controller's Java Heap Size in bytes.
> MapReduce Child Java Maximum Heap Size
> Reduce Task Maximum Heap Size
> TaskTracker's Java Heap Size in bytes.
>
> When I start my cluster these settings consume a large amount of memory. What is
> the best way to handle or fine-tune these parameters? Please guide me.
>
> Also, what is the best way to monitor those configurations?
>
> Any graphical tool to help us to monitor?
>
> -Dhanasekaran.
>
>
> Did I learn something today? If not, I wasted it.
>
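
For the daemon heap sizes listed in the question above, here is a minimal sketch of
where they are typically set in a plain Apache Hadoop 1.x deployment; the values are
illustrative assumptions, not recommendations:

  # hadoop-env.sh -- illustrative values only
  export HADOOP_HEAPSIZE=1024                                  # default daemon heap, in MB
  export HADOOP_NAMENODE_OPTS="-Xmx2g $HADOOP_NAMENODE_OPTS"   # override just the NameNode

The MapReduce child and reduce task maximums correspond to mapred.child.java.opts
(for example -Xmx512m) in mapred-site.xml. Note that the child heap is allocated per
task, so the total memory used on a node is roughly the number of task slots times
that value; treat the numbers above purely as placeholders for your own sizing.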
