hadoop-user mailing list archives

From Dhanasekaran Anbalagan <bugcy...@gmail.com>
Subject Best way to tune the Hadoop heap size parameters
Date Mon, 11 Mar 2013 14:11:13 GMT
Hi Guys,

We have a problem with our production Hadoop cluster: most of the time we are
seeing Java heap size issues, where one of the Hadoop components fails with an
out-of-memory error.

2013-03-08 08:01:10,749 WARN org.apache.hadoop.ipc.Server: IPC Server
handler 57 on 8020, call
org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.blockReport from
error: java.lang.OutOfMemoryError: Java heap space
java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:2760)
        at java.util.Arrays.copyOf(Arrays.java:2734)

Is there any mechanism to fine-tune the NameNode and JobTracker heap sizes?
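For what it's worth, daemon heap sizes are normally set in conf/hadoop-env.sh. A minimal sketch is below; the specific values (2000 MB default, 4g for the NameNode, 2g for the JobTracker) are illustrative assumptions for this example, not recommendations for any particular cluster:

```shell
# conf/hadoop-env.sh -- illustrative values only, adjust to your cluster

# Default maximum heap for all Hadoop daemons, in MB
export HADOOP_HEAPSIZE=2000

# Per-daemon overrides appended to the common JVM options
export HADOOP_NAMENODE_OPTS="-Xmx4g -XX:+HeapDumpOnOutOfMemoryError ${HADOOP_NAMENODE_OPTS}"
export HADOOP_JOBTRACKER_OPTS="-Xmx2g ${HADOOP_JOBTRACKER_OPTS}"
```

The daemons pick these settings up on restart, so the NameNode and JobTracker can be sized independently of the DataNodes and TaskTrackers.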

In our current scenario the DFS size is 181 TB, and we keep adding data to the
cluster.

Is there any calculation [like a formula] relating data size to the NameNode
heap size and DataNode heap size?
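One commonly cited rule of thumb (not an official formula) is that the NameNode holds every file, directory, and block in memory, at very roughly 150 bytes per object, which works out to about 1 GB of heap per million blocks. A hedged sketch of that estimate follows; the 150-byte figure, the 2x safety factor, the 128 MB block size, and the assumption of one block per file are all illustrative assumptions:

```python
def estimate_namenode_heap_gb(num_files, num_blocks, num_dirs,
                              bytes_per_object=150, safety_factor=2.0):
    """Rough NameNode heap estimate from namespace object counts.

    Assumes ~150 bytes per in-memory object (file, block, or directory),
    a rule-of-thumb figure, padded by a safety factor for growth.
    """
    objects = num_files + num_blocks + num_dirs
    raw_bytes = objects * bytes_per_object
    return raw_bytes * safety_factor / (1024 ** 3)

# Example: 181 TiB of data at a 128 MiB block size, assuming roughly
# one block per file and a modest number of directories.
blocks = (181 * 1024 ** 4) // (128 * 1024 ** 2)  # about 1.5 million blocks
heap_gb = estimate_namenode_heap_gb(num_files=blocks, num_blocks=blocks,
                                    num_dirs=10_000)
print(blocks, round(heap_gb, 2))
```

By this estimate, 181 TB of large files needs only a modest NameNode heap; the count of files and blocks, not raw data volume, is what drives it, so many small files would change the answer dramatically. DataNode heap, by contrast, scales with the number of blocks each individual node stores, not the cluster total.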

Please guide me.


Did I learn something today? If not, I wasted it.
