mahout-user mailing list archives

From Paul Mahon <pma...@decarta.com>
Subject Re: Heap Size question.
Date Wed, 01 Jun 2011 15:19:16 GMT
That's a Hadoop question rather than a Mahout question. For a full 
answer you'll probably want to try the Hadoop list.

That number is the heap for the JobTracker itself, which keeps track of which 
jobs are running and which have run; it is not the heap for the job driver or 
the tasks themselves. The out-of-memory errors are most likely happening in the 
tasks, and are not related to that Cluster Summary heap size number.
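
If it's the tasks that are running out of memory, the per-task heap in Hadoop 
0.20.x is controlled by mapred.child.java.opts (the default is -Xmx200m). 
HADOOP_HEAPSIZE sets the heap for the daemons and the hadoop command itself, 
not for the task JVMs. As a rough sketch, something like this in 
conf/mapred-site.xml should raise the per-task heap (1024m is just an example 
value, not a recommendation):

  <property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx1024m</value>
  </property>

The same property can also be overridden per job in the job configuration.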

On 06/01/2011 04:35 AM, Ken Williams wrote:
>
>    Hi All,
>
>    I'm a bit confused about the values displayed on the 'jobtracker.jsp' page.
>    In particular, there's a section called 'Cluster Summary'.
>
>    I'm running a small 4-machine Hadoop cluster, and when I point a web-browser
>    at my master machine (http://master:50030/jobtracker.jsp) it displays,
>
>                 Cluster Summary (Heap Size is 15.5 MB / 1.89 GB)
>
>     What exactly do these figures mean?
>
>     I know that the second figure (1.89 GB) is determined by the value of
>     the HADOOP_HEAPSIZE variable set in  'conf/hadoop-env.sh'. What I'm not
>     sure about is exactly what that setting controls, or where the first value (15.5 MB)
>     comes from or what it means.
>
>     I'm guessing the 1.89 GB is the amount of heap memory allocated to Hadoop
>     on each machine in the cluster. (Correct?)
>
>     I have no idea what the 15.5 MB means or where it comes from. It never changes,
>     not even when a job is running, and I can't find any explanation in the documentation.
>
>     This page, https://issues.apache.org/jira/browse/HADOOP-4435, seems to suggest
>     that the 15.5 MB should be the amount of heap memory currently in use, but since
>     this value never changes - not even when a job is running and I refresh the page
>     - I'm not convinced this is working.
>
>     I'm asking this question because I have a Mahout job which slowly comes to a halt with
>     a lot of 'OutOfMemoryError: Java heap space' errors, before it is 'Killed'.
>
>     I'm using Hadoop 0.20.2 and the latest Mahout snapshot version.
>
>     Thanks for any help.
>
>             Ken
>
