giraph-user mailing list archives

From Franco Maria Nardini <>
Subject memory problem
Date Thu, 13 Sep 2012 21:01:36 GMT
Hi all,

I am running the PageRank code on a single-machine Hadoop installation.
In particular, I am running the code with four workers on a graph of
100,000 nodes. I am getting this exception:

2012-09-13 22:54:38,379 INFO org.apache.hadoop.mapred.Task:
Communication exception: java.lang.OutOfMemoryError: GC overhead limit exceeded
	at org.apache.hadoop.util.ProcfsBasedProcessTree.getProcessList(
	at org.apache.hadoop.util.ProcfsBasedProcessTree.getProcessTree(
	at org.apache.hadoop.util.LinuxResourceCalculatorPlugin.getProcResourceValues(
	at org.apache.hadoop.mapred.Task.updateResourceCounters(
	at org.apache.hadoop.mapred.Task.updateCounters(
	at org.apache.hadoop.mapred.Task.access$600(
	at org.apache.hadoop.mapred.Task$

It seems to be a problem related to the communication layer. Do I have
to specify or modify some Hadoop options to avoid this?
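For context, one thing I have considered trying is raising the per-task JVM heap, since the "GC overhead limit exceeded" error usually means the task JVM is running close to its maximum heap. A sketch of the kind of invocation I have in mind (the jar name and heap size here are placeholders, not my actual setup) would be:

```shell
# Hypothetical invocation: pass a larger heap to the child task JVMs via
# the standard Hadoop 1.x property mapred.child.java.opts (the default
# heap is often only -Xmx200m, which is easily exhausted by a Giraph
# worker holding part of the graph plus its message buffers).
hadoop jar giraph-with-dependencies.jar org.apache.giraph.GiraphRunner \
  -Dmapred.child.java.opts=-Xmx2048m \
  ...
```

I am not sure whether this is the right knob for this particular error, or whether there is a Giraph-specific option I should be setting instead.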


Franco Maria
