hadoop-mapreduce-user mailing list archives

From Zizon Qiu <zzd...@gmail.com>
Subject Re: hanging context.write() with large arrays
Date Sat, 05 May 2012 14:36:41 GMT
For the timeout problem, you can use a background thread that invokes
context.progress() periodically, which acts as a "keep-alive" for the forked
Child (mapper/combiner/reducer)...
It is tricky, but it works.
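
In code, the trick looks roughly like this (a minimal sketch against the
new org.apache.hadoop.mapreduce API; the AdjacencyListReducer name, the
Text/ArrayWritable key/value types and the 60-second period are my own
illustrative assumptions, not details from your job):

import org.apache.hadoop.io.ArrayWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class AdjacencyListReducer
        extends Reducer<Text, Text, Text, ArrayWritable> {

    private Thread keepAlive;
    private volatile boolean done = false;

    @Override
    protected void setup(final Context context) {
        keepAlive = new Thread(new Runnable() {
            public void run() {
                while (!done) {
                    // Tell the framework this attempt is still alive,
                    // well inside the mapred.task.timeout window
                    // (600s by default).
                    context.progress();
                    try {
                        Thread.sleep(60 * 1000L);
                    } catch (InterruptedException e) {
                        return;
                    }
                }
            }
        });
        keepAlive.setDaemon(true); // never blocks JVM exit
        keepAlive.start();
    }

    @Override
    protected void cleanup(Context context) {
        done = true;
        keepAlive.interrupt();
    }

    // reduce(...) stays as it is; it can now spend minutes building and
    // writing one huge ArrayWritable without the TaskTracker killing
    // the attempt for inactivity.
}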

On Sat, May 5, 2012 at 10:05 PM, Zuhair Khayyat <zuhair.khayyat@kaust.edu.sa> wrote:

> Hi,
>
> I am building a MapReduce application that constructs the adjacency list
> of a graph from an input edge list. I noticed that my Reduce phase always
> hangs (and eventually times out) when it calls
> context.write(Key_x, Value_x) and Value_x is a very large ArrayWritable
> (around 4M elements). I have increased both "mapred.task.timeout" and the
> reducers' memory, but no luck; the reducer never finishes the job. Is
> there any other data format that supports such a large amount of data, or
> should I write my own "OutputFormat" class to optimize writing it?
>
>
> Thank you.
> Zuhair Khayyat
>
