hadoop-common-dev mailing list archives

From "Sanjay Dahiya (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-817) Streaming reducers throw OutOfMemory for not so large inputs
Date Fri, 15 Dec 2006 20:59:22 GMT
    [ http://issues.apache.org/jira/browse/HADOOP-817?page=comments#action_12458930 ] 
Sanjay Dahiya commented on HADOOP-817:

Tracked it down to MergeQueue.merge(); after a couple of iterations of that loop it runs out of memory.
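A hypothetical sketch (not the actual Hadoop MergeQueue code; all names here are made up for illustration) of the kind of pattern that produces this failure mode: if a merge loop retains a reference to each segment's buffer for the lifetime of the merge, heap usage grows linearly with the number of input segments instead of staying bounded by the number of streams open at once.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative only: demonstrates per-segment buffers outliving their
// segment, so memory grows with input size across merge iterations.
public class MergeLeakSketch {
    // Simulates one merge pass over `segments` inputs, each backed by a
    // buffer of `bufSize` bytes. Returns total bytes still retained at
    // the end of the pass.
    static long mergeRetainingBuffers(int segments, int bufSize) {
        List<byte[]> retained = new ArrayList<>();   // leak: never cleared
        for (int i = 0; i < segments; i++) {
            byte[] segmentBuffer = new byte[bufSize];
            // ... merge records from this segment ...
            retained.add(segmentBuffer);             // buffer outlives its segment
        }
        return (long) retained.size() * bufSize;     // grows linearly with input
    }

    public static void main(String[] args) {
        // ~70 files of 20 KB, as in the report, pins ~1.4 MB per pass;
        // repeated merge passes multiply that.
        long bytes = mergeRetainingBuffers(70, 20 * 1024);
        System.out.println("retained bytes after one pass: " + bytes);
    }
}
```

With a bounded merge, only the buffers for the currently open streams would be live at any point, so heap use would not scale with the total number of segments.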

> Streaming reducers throw OutOfMemory for not so large inputs
> ------------------------------------------------------------
>                 Key: HADOOP-817
>                 URL: http://issues.apache.org/jira/browse/HADOOP-817
>             Project: Hadoop
>          Issue Type: Bug
>          Components: contrib/streaming
>            Reporter: Sanjay Dahiya
> I am seeing OutOfMemoryError for moderate-size inputs (~70 text files, 20 KB each) causing
> the job to fail in streaming. For very small inputs it still succeeds. Looking into details.

This message is automatically generated by JIRA.
If you think it was sent incorrectly contact one of the administrators: http://issues.apache.org/jira/secure/Administrators.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira

