hadoop-common-dev mailing list archives

From "Christian Kunz (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-2907) dead datanodes because of OutOfMemoryError
Date Wed, 27 Feb 2008 03:24:51 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-2907?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12572765#action_12572765 ]

Christian Kunz commented on HADOOP-2907:
----------------------------------------

I found a couple of datanodes with a different exception (still OutOfMemoryError):

in the out file:
Exception in thread "org.apache.hadoop.dfs.DataNode$DataXceiver@9d54d" java.lang.OutOfMemoryError: Java heap space

in the log file:
2008-02-26 23:01:12,810 ERROR org.apache.hadoop.dfs.DataNode: xxx.xxx.xxx.101:50010:DataXceiver: java.lang.OutOfMemoryError: Java heap space
        at java.io.BufferedInputStream.<init>(BufferedInputStream.java:178)
        at org.apache.hadoop.dfs.DataNode$DataXceiver.run(DataNode.java:919)
        at java.lang.Thread.run(Thread.java:619)
or

2008-02-27 01:30:20,703 ERROR org.apache.hadoop.dfs.DataNode: xxx.xxx.xxx.136:50010:DataXceiver: java.lang.OutOfMemoryError: Java heap space
        at java.io.BufferedInputStream.<init>(BufferedInputStream.java:178)
        at org.apache.hadoop.dfs.DataNode$BlockSender.<init>(DataNode.java:1521)
        at org.apache.hadoop.dfs.DataNode$DataXceiver.readBlock(DataNode.java:992)
        at org.apache.hadoop.dfs.DataNode$DataXceiver.run(DataNode.java:930)
        at java.lang.Thread.run(Thread.java:619)
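Both traces die allocating the internal buffer in the BufferedInputStream constructor, so each active DataXceiver thread holds at least one such buffer for the life of the transfer. A minimal back-of-the-envelope sketch of why this adds up (the thread count and buffer size below are illustrative assumptions, not measurements from this cluster; the buffer is typically sized from the io.file.buffer.size setting):

```java
public class XceiverBufferEstimate {
    // Hypothetical figures for illustration only, not measured values.
    static final int XCEIVER_THREADS = 4096;      // concurrent DataXceiver threads
    static final int BUFFER_BYTES = 128 * 1024;   // per-stream buffer size

    // Aggregate heap consumed by the stream buffers alone, ignoring
    // all other per-thread and per-block state.
    static long aggregateBufferBytes(int threads, int bufferBytes) {
        return (long) threads * bufferBytes;
    }

    public static void main(String[] args) {
        long total = aggregateBufferBytes(XCEIVER_THREADS, BUFFER_BYTES);
        System.out.println(total / (1024 * 1024) + " MB from stream buffers");
    }
}
```

With these assumed numbers the buffers alone account for 512 MB of heap, so a datanode running with a default-sized heap has little headroom before the next BufferedInputStream allocation fails.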



> dead datanodes because of OutOfMemoryError
> ------------------------------------------
>
>                 Key: HADOOP-2907
>                 URL: https://issues.apache.org/jira/browse/HADOOP-2907
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: dfs
>    Affects Versions: 0.16.0
>            Reporter: Christian Kunz
>
> We see more dead datanodes than in previous releases. The common exception is found in the out file:
> Exception in thread "org.apache.hadoop.dfs.DataBlockScanner@18166e5" java.lang.OutOfMemoryError: Java heap space
> Exception in thread "DataNode: [dfs.data.dir-value]" java.lang.OutOfMemoryError: Java heap space

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

