hadoop-common-dev mailing list archives

From "Sanjay Dahiya (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-817) Streaming reducers throw OutOfMemory for not so large inputs
Date Mon, 18 Dec 2006 11:31:22 GMT
    [ http://issues.apache.org/jira/browse/HADOOP-817?page=comments#action_12459274 ] 
            
Sanjay Dahiya commented on HADOOP-817:
--------------------------------------

After a bit of profiling, I see 4 buffers of size 20971520 (FSDataInputStream.Buffer) and 11
buffers of size 9532509, in the first and second iterations respectively. These are not cleaned
up before the VM runs out of memory. Trace snapshot attached. Debugging further.

> Streaming reducers throw OutOfMemory for not so large inputs
> ------------------------------------------------------------
>
>                 Key: HADOOP-817
>                 URL: http://issues.apache.org/jira/browse/HADOOP-817
>             Project: Hadoop
>          Issue Type: Bug
>          Components: contrib/streaming
>            Reporter: Sanjay Dahiya
>         Assigned To: Sanjay Dahiya
>
> I am seeing OutOfMemoryError for moderate-size inputs (~70 text files, 20k each) causing the
> job to fail in streaming. For very small inputs it still succeeds. Looking into details.

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators: http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see: http://www.atlassian.com/software/jira

        
