hadoop-common-user mailing list archives

From "Peter W." <pe...@marketingbrokers.com>
Subject Re: Mapper Out of Memory
Date Thu, 06 Dec 2007 22:02:25 GMT
Hello,

There is a setting in hadoop-0.15.0/bin/rcc

default:

JAVA_HEAP_MAX=-Xmx1000m

For a machine with 2GB of memory you can set this to about:

JAVA_HEAP_MAX=-Xmx1700m

2048m is the highest allowed setting on a Mac, Linux,
non-Solaris Unix, or Windows box.
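Note that JAVA_HEAP_MAX in the bin scripts sets the heap for the JVM those scripts launch; the heap for the spawned map/reduce child tasks is typically controlled by the mapred.child.java.opts property (default -Xmx200m) in hadoop-site.xml. A minimal sketch, assuming that property is supported in your release:

```xml
<!-- hadoop-site.xml: raise the heap for each spawned map/reduce child JVM.
     Assumes mapred.child.java.opts is available in your Hadoop release. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx512m</value>
</property>
```

Since this is a job-level property, it can also be set per job without touching the cluster-wide configuration.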

Peter W.

On Dec 6, 2007, at 12:30 PM, Rui Shi wrote:

>
> Hi,
>
> I run Hadoop on a BSD4 cluster, and each map task processes a gzip file
> (about 10MB). Some tasks finished, but many of them failed due to
> heap out of memory. I got the following syslogs:
>
> 2007-12-06 12:16:50,277 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=MAP, sessionId=
> 2007-12-06 12:16:53,128 INFO org.apache.hadoop.mapred.MapTask: numReduceTasks: 256
> 2007-12-06 12:16:53,638 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2007-12-06 12:18:19,079 WARN org.apache.hadoop.mapred.TaskTracker: Error running child
> java.lang.OutOfMemoryError: Java heap space
> Does anyone know the reason, and how we can avoid it?
>
> Thanks,
>
> Rui
