hadoop-common-issues mailing list archives

From "Eric Yang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-10759) Remove hardcoded JAVA_HEAP_MAX in hadoop-config.sh
Date Wed, 06 Aug 2014 06:43:12 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-10759?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14087330#comment-14087330 ]

Eric Yang commented on HADOOP-10759:
------------------------------------

The information in ZOOKEEPER-1670 is not entirely accurate.  Java does a good job of calculating
the default heap size: it will use 1/4 of the machine's memory, up to 1GB.  See:

http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html#par_gc.ergonomics.default_size

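For reference, the maximum heap that ergonomics would pick on a given machine can be checked
with HotSpot's PrintFlagsFinal option, e.g.:

  java -XX:+PrintFlagsFinal -version | grep -i MaxHeapSize
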
Therefore, even without this being specified, the JVM may use up to 1GB of heap on a machine
that has more than 4GB of physical memory.  However, for smaller machines such as virtual
machines, it would be nicer if the heap could scale dynamically.  Another benefit of removing
this hardcoded value is that the Hadoop command line is not capped at 1GB for trivial operations
such as GetConf or DFS client operations, which reduces memory starvation when many CLI
operations run in parallel with map reduce tasks.  We have noticed that when a machine has
already handed out most of its memory to map reduce tasks and some number of CLI commands run
in parallel, the excessive memory allocation can cause the JVMs to run garbage collection
aggressively and increases the chance of deadlock on highly fragmented memory pages.
This is a fairly serious bug, and I think it is worthwhile to include the fix in the 2.x releases.
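
A minimal sketch of what the change could look like, assuming the rest of hadoop-config.sh
keeps using the existing HADOOP_HEAPSIZE knob: drop the hardcoded default and only pass -Xmx
when the user has asked for a specific size, letting JVM ergonomics decide otherwise.

  # Before: every command gets a fixed 1000MB maximum heap.
  # JAVA_HEAP_MAX=-Xmx1000m

  # After (sketch): leave JAVA_HEAP_MAX empty unless HADOOP_HEAPSIZE is set,
  # so the JVM's own ergonomics size the heap for trivial CLI commands.
  JAVA_HEAP_MAX=""
  if [ -n "$HADOOP_HEAPSIZE" ]; then
    JAVA_HEAP_MAX="-Xmx${HADOOP_HEAPSIZE}m"
  fi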

> Remove hardcoded JAVA_HEAP_MAX in hadoop-config.sh
> --------------------------------------------------
>
>                 Key: HADOOP-10759
>                 URL: https://issues.apache.org/jira/browse/HADOOP-10759
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: bin
>    Affects Versions: 2.4.0
>         Environment: Linux64
>            Reporter: sam liu
>            Priority: Minor
>             Fix For: 2.6.0
>
>         Attachments: HADOOP-10759.patch, HADOOP-10759.patch
>
>
> In hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh, there is a hardcoded
> Java parameter: 'JAVA_HEAP_MAX=-Xmx1000m'. It should be removed.



