hadoop-common-user mailing list archives

From Raghu Angadi <rang...@yahoo-inc.com>
Subject Re: DataXceiver: java.io.IOException: Connection reset by peer
Date Tue, 01 Jul 2008 19:22:23 GMT

This is mostly harmless. We should remove this message. What usually 
happens is that the client opens a file to read x bytes and closes the 
connection after reading those x bytes, but the datanode does not know 
that (when the client uses the normal read() interface).
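
To illustrate, here is a minimal Java sketch of the read pattern I mean 
(the file path and byte count are made up for the example): the client 
reads only the first x bytes of a large file and closes the stream, while 
the datanode's BlockSender is still writing chunks to the socket, so the 
datanode's next write fails with "Connection reset by peer".

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PartialRead {
  public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    int x = 4096;                 // read only the first x bytes
    byte[] buf = new byte[x];

    // hypothetical path, just for illustration
    FSDataInputStream in = fs.open(new Path("/some/large/file"));
    try {
      in.readFully(0, buf);       // plain read() path, no length hint to the datanode
    } finally {
      in.close();                 // datanode may still be sending block chunks;
                                  // its write then fails with the IOException above
    }
  }
}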

Please file a jira to get rid of this message. It is really confusing to 
the user and pollutes the log file.

Raghu.

Rong-en Fan wrote:
> Hi,
> 
> I'm using Hadoop 0.17.1 with HBase trunk, and noticed lots of exceptions
> in hadoop's log (it's a 3-node HDFS):
> 
> 2008-06-30 19:27:45,760 ERROR org.apache.hadoop.dfs.DataNode: 192.168.23.1:50010:DataXceiver: java.io.IOException: Connection reset by peer
>         at sun.nio.ch.FileDispatcher.write0(Native Method)
>         at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:29)
>         at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:104)
>         at sun.nio.ch.IOUtil.write(IOUtil.java:75)
>         at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:334)
>         at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:53)
>         at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:140)
>         at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:144)
>         at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:105)
>         at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
>         at java.io.DataOutputStream.write(DataOutputStream.java:90)
>         at org.apache.hadoop.dfs.DataNode$BlockSender.sendChunks(DataNode.java:1774)
>         at org.apache.hadoop.dfs.DataNode$BlockSender.sendBlock(DataNode.java:1813)
>         at org.apache.hadoop.dfs.DataNode$DataXceiver.readBlock(DataNode.java:1039)
>         at org.apache.hadoop.dfs.DataNode$DataXceiver.run(DataNode.java:968)
>         at java.lang.Thread.run(Thread.java:619)
> 
> It seems to me that the datanode cannot handle the incoming traffic.
> If so, what parameters in the hadoop site config and/or in the OS (I'm using
> RHEL 4) can I play with?
> 
> Thanks,
> Rong-En Fan

