Subject: java.net.SocketTimeoutException in the Datanode
From: xeon Mailinglist
To: user@hadoop.apache.org
Date: Fri, 3 Jan 2014 23:55:55 +0000

I am running a wordcount example on MRv2, but I get this error in a Datanode. It looks like a problem in the network between the Namenode and the Datanode, but I am not sure.

What is this error? How can I fix it?

2014-01-03 16:46:29,319 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-570096904-155.99.144.100-1388771741297:blk_-3952564661572372834_1072 received exception java.net.SocketTimeoutException: 60000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/155.99.144.101:50010 remote=/155.99.144.101:44937]
2014-01-03 16:46:30,177 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: pcvm3-3.utahddc.geniracks.net:50010:DataXceiver error processing WRITE_BLOCK operation src: /155.99.144.101:44937 dest: /155.99.144.101:50010
java.net.SocketTimeoutException: 60000 millis timeout while waiting for channel to be ready for read.
ch : java.nio.channels.SocketChannel[connected local=/155.99.144.101:50010 remote=/155.99.144.101:44937]
        at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
        at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:159)
        at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:129)
        at java.io.FilterInputStream.read(FilterInputStream.java:133)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
        at java.io.DataInputStream.read(DataInputStream.java:149)
        at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:192)
        at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
        at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
        at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:414)
        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:644)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:506)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:98)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:65)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:219)
        at java.lang.Thread.run(Thread.java:701)
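In case it helps the discussion, this is a minimal sketch of what I am considering trying, assuming the default 60-second socket read timeout is simply too short for my cluster (rather than a real network fault), and assuming these property names apply to my Hadoop 2.x build. The values are only examples; I would set them in hdfs-site.xml on every node and restart HDFS:

<configuration>
  <!-- Socket read timeout used by DFS clients and DataNodes; the default of
       60000 ms matches the "60000 millis timeout" in the log above. -->
  <property>
    <name>dfs.client.socket-timeout</name>
    <value>180000</value>
  </property>
  <!-- Timeout for a DataNode writing to a downstream socket (default 480000 ms). -->
  <property>
    <name>dfs.datanode.socket.write.timeout</name>
    <value>600000</value>
  </property>
</configuration>

I realize that raising the timeouts would only hide the problem if the real cause is a firewall or routing issue between the machines, so I am also checking that port 50010 on 155.99.144.101 is reachable from the other nodes.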