hadoop-common-user mailing list archives

From Arun C Murthy <...@yahoo-inc.com>
Subject Re: OutOfMemory error processing large amounts of gz files
Date Thu, 26 Feb 2009 09:50:10 GMT

On Feb 24, 2009, at 4:03 PM, bzheng wrote:
>

> 2009-02-23 14:27:50,902 INFO org.apache.hadoop.mapred.TaskTracker:
> java.lang.OutOfMemoryError: Java heap space
>

That tells us that your TaskTracker is running out of memory, not
your reduce tasks.
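(A common mitigation when the TaskTracker daemon itself is heap-starved is to raise its JVM heap in conf/hadoop-env.sh. The sketch below is only an illustration; the 1024 MB figure is an assumption, not a value recommended in this thread:)

```shell
# conf/hadoop-env.sh -- illustrative only; the -Xmx1024m value is an
# assumed example, not a recommendation from this thread.
# HADOOP_TASKTRACKER_OPTS passes extra JVM options to the TaskTracker
# daemon; -Xmx raises its maximum heap size.
export HADOOP_TASKTRACKER_OPTS="-Xmx1024m"
```

Note that this only papers over a leak like the one described below; upgrading to a release containing the fix is the real remedy.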

I think you are hitting http://issues.apache.org/jira/browse/HADOOP-4906.

What version of hadoop are you running?

Arun

