hadoop-mapreduce-user mailing list archives

From Mapred Learn <mapred.le...@gmail.com>
Subject how to set huge memory for reducer in streaming
Date Sun, 29 Jul 2012 08:17:14 GMT
One of my programs creates a huge Python dictionary, and the reducers fail with a
MemoryError every time.

Is there a way to specify a larger memory limit for the reducers so they can
succeed?

I know we should not have this requirement in the first place and should not
create this kind of dictionary, but can I still finish this job by giving it
more memory in the jar command?
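For context, a hedged sketch of what such a streaming invocation might look like on Hadoop 1.x. Note that with streaming the Python reducer runs as a separate child process, so raising the JVM heap via `mapred.child.java.opts` alone would not help it; the task-level limits `mapred.job.reduce.memory.mb` and `mapred.child.ulimit` are the relevant knobs. The paths, script names, and values below are placeholders, and whether `mapred.job.reduce.memory.mb` is actually enforced depends on the cluster's scheduler configuration:

```shell
# Placeholder paths and values; adjust for the actual cluster and job.
# -D generic options must come before the streaming-specific options.
#   mapred.job.reduce.memory.mb : memory (MB) the scheduler reserves per reduce task
#   mapred.child.ulimit         : virtual-memory ulimit (KB) for child processes,
#                                 which covers the spawned Python reducer
hadoop jar "$HADOOP_HOME/contrib/streaming/hadoop-streaming.jar" \
  -D mapred.job.reduce.memory.mb=4096 \
  -D mapred.child.ulimit=8388608 \
  -input /user/me/input \
  -output /user/me/output \
  -mapper mapper.py \
  -reducer reducer.py \
  -file mapper.py \
  -file reducer.py
```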

