hadoop-mapreduce-user mailing list archives

From Shing Hing Man <mat...@yahoo.com>
Subject Re: Pseudo mode :Reduce task failed when there are more than one reducers
Date Sun, 10 Jul 2011 18:55:24 GMT
After setting mapred.child.java.opts=-Xmx512m (the default was 200m), I no longer get the exception.
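For reference, a minimal sketch of the equivalent mapred-site.xml entry, assuming the 0.21-era property name (later releases deprecate mapred.child.java.opts in favour of separate mapreduce.map.java.opts / mapreduce.reduce.java.opts settings):

```xml
<!-- mapred-site.xml: raise the child task JVM heap from the 200 MB default -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx512m</value>
</property>
```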

Shing 




________________________________
From: Shing Hing Man <matmsh@yahoo.com>
To: "mapreduce-user@hadoop.apache.org" <mapreduce-user@hadoop.apache.org>
Sent: Wednesday, 6 July 2011, 21:37
Subject: Pseudo mode :Reduce task  failed when there are more than one  reducers 


Hi,

In pseudo-distributed mode, when I set the number of reducers to 2, I get the following error from a reduce task:
11/07/06 21:15:59 INFO mapreduce.Job: Task Id : attempt_201107062043_0002_r_000000_2, Status
: FAILED
org.apache.hadoop.mapreduce.task.reduce.Shuffle$ShuffleError: error in shuffle in OnDiskMerger
- Thread to merge on-disk map-outputs
        at org.apache.hadoop.mapreduce.task.reduce.Shuffle.run(Shuffle.java:124)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:362)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:217)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:742)
        at org.apache.hadoop.mapred.Child.main(Child.java:211)
Caused by: java.lang.RuntimeException: java.io.EOFException
        at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:132)
        at org.apache.hadoop.mapred.Merger$MergeQueue.lessThan(Merger.java:530)
        at org.apache.hadoop.util.PriorityQueue.downHeap(PriorityQueue.java:141)
        at org.apache.hadoop.util.PriorityQueue.adjustTop(PriorityQueue.java:108)
        at org.apache.hadoop.mapred.Merger$MergeQueue.adjustPriorityQueue(Mer


But when I set the number of reducers to 1, the job completes without the above error.
I am using Hadoop 0.21.0.
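For anyone reproducing this: the reducer count can be set in the job driver with job.setNumReduceTasks(2), or from the command line if the driver uses ToolRunner/GenericOptionsParser. A hedged sketch of the latter (jar name, driver class, and paths are placeholders; mapred.reduce.tasks is the 0.21-era property, later renamed mapreduce.job.reduces):

```shell
# Run the job with two reducers. Assumes the driver parses generic
# options, so -D must appear before the job's own arguments.
hadoop jar myjob.jar MyDriver -D mapred.reduce.tasks=2 input/ output/
```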

Thanks in advance for any assistance!

Shing 