hadoop-common-user mailing list archives

From Stefan Groschupf <...@101tec.com>
Subject [memory leak?] Re: MapReduce failure
Date Sun, 16 Mar 2008 03:37:08 GMT
Hi there,

We are seeing the same situation, and browsing the list there are quite a lot
of people running into this OOM problem.
We run our own Mapper and our mapred.child.java.opts is -Xmx3048m, which I
think should be more than enough.
I also changed io.sort.mb to 10, which had no impact either.
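
For reference, here is a minimal sketch of how those two settings could be
passed per job through the old JobConf API (the class name is made up for
illustration; the same keys can of course also live in hadoop-site.xml):

import org.apache.hadoop.mapred.JobConf;

public class JobOptsSketch {
  public static void main(String[] args) {
    JobConf conf = new JobConf();
    // Heap for the child JVMs that run the map and reduce tasks.
    conf.set("mapred.child.java.opts", "-Xmx3048m");
    // Size (in MB) of the in-memory buffer used to hold and sort map output
    // before it is spilled to disk.
    conf.setInt("io.sort.mb", 10);
    System.out.println(conf.get("mapred.child.java.opts")
        + " / " + conf.get("io.sort.mb"));
  }
}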

Any ideas what might cause the OutOfMemoryError?
Thanks.
Stefan




On Mar 9, 2008, at 10:28 PM, Amar Kamat wrote:

> What is the heap size you are using for your tasks? Check
> 'mapred.child.java.opts' in your hadoop-default.xml and try increasing
> it. This will happen if you try running the random-writer + sort
> examples with default parameters: the maps are not able to spill the
> data to the disk. Btw, what version of Hadoop are you using?
> Amar
> On Mon, 10 Mar 2008, Ved Prakash wrote:
>
>> Hi friends,
>>
>> I have set up a cluster of 3 machines, one of them the master and the
>> other 2 slaves. I executed a MapReduce job on the master, but after the
>> map phase the execution terminates and the reduce phase doesn't happen.
>> I have checked DFS and no output folder gets created.
>>
>> This is the error I see:
>>
>> 08/03/10 10:35:21 INFO mapred.JobClient: Task Id : task_200803101001_0001_m_000064_0, Status : FAILED
>> java.lang.OutOfMemoryError: Java heap space
>>       at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:95)
>>       at java.io.DataOutputStream.write(DataOutputStream.java:90)
>>       at org.apache.hadoop.io.Text.write(Text.java:243)
>>       at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:347)
>>       at org.apache.hadoop.examples.WordCount$MapClass.map(WordCount.java:72)
>>       at org.apache.hadoop.examples.WordCount$MapClass.map(WordCount.java:59)
>>       at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>>       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:192)
>>       at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1787)
>>
>> 08/03/10 10:35:22 INFO mapred.JobClient:  map 55% reduce 17%
>> 08/03/10 10:35:31 INFO mapred.JobClient:  map 56% reduce 17%
>> 08/03/10 10:35:51 INFO mapred.JobClient:  map 57% reduce 17%
>> 08/03/10 10:36:04 INFO mapred.JobClient:  map 58% reduce 17%
>> 08/03/10 10:36:07 INFO mapred.JobClient:  map 57% reduce 17%
>> 08/03/10 10:36:07 INFO mapred.JobClient: Task Id : task_200803101001_0001_m_000071_0, Status : FAILED
>> java.lang.OutOfMemoryError: Java heap space
>>       at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:95)
>>       at java.io.DataOutputStream.write(DataOutputStream.java:90)
>>       at org.apache.hadoop.io.Text.write(Text.java:243)
>>       at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:347)
>>       at org.apache.hadoop.examples.WordCount$MapClass.map(WordCount.java:72)
>>       at org.apache.hadoop.examples.WordCount$MapClass.map(WordCount.java:59)
>>       at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>>       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:192)
>>       at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1787)
>>
>> Though it tries to overcome this problem by retrying the failed tasks,
>> the MapReduce application doesn't create output. Can anyone tell me why
>> this is happening?
>>
>> Thanks
>>
>
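
For anyone else who lands on this thread: the frames in the traces above come
from the stock WordCount example that ships with Hadoop, roughly the class
below (a from-memory sketch of the old "mapred" API, renamed here to
WordCountMapSketch, not the exact shipped source; the line numbers in the
traces refer to the real file). The OutOfMemoryError is raised while
output.collect() serializes the key/value pair into the map task's in-memory
MapOutputBuffer.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class WordCountMapSketch extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, IntWritable> {

  private final static IntWritable one = new IntWritable(1);
  private Text word = new Text();

  public void map(LongWritable key, Text value,
                  OutputCollector<Text, IntWritable> output,
                  Reporter reporter) throws IOException {
    StringTokenizer itr = new StringTokenizer(value.toString());
    while (itr.hasMoreTokens()) {
      word.set(itr.nextToken());
      // This is the call that shows up as WordCount$MapClass.map ->
      // MapOutputBuffer.collect -> Text.write in the traces: the pair is
      // copied into the in-memory map output buffer here, and the buffer's
      // backing ByteArrayOutputStream fails to grow.
      output.collect(word, one);
    }
  }
}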

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
101tec Inc.
Menlo Park, California, USA
http://www.101tec.com


