From: lei liu <liulei412@gmail.com>
Date: Mon, 7 Jul 2014 09:34:18 +0800
Subject: java.net.SocketTimeoutException: read(2) error: Resource temporarily unavailable
To: user@hadoop.apache.org, cdh-user@cloudera.org

I use hbase-0.94 and hadoop-2.2, and I get the exception below:

2014-07-04 12:43:49,700 WARN org.apache.hadoop.hdfs.DFSClient: failed to connect to DomainSocket(fd=322,path=/home/hadoop/hadoop-current/cdh4-dn-socket/dn_socket)
java.net.SocketTimeoutException: read(2) error: Resource temporarily unavailable
        at org.apache.hadoop.net.unix.DomainSocket.readArray0(Native Method)
        at org.apache.hadoop.net.unix.DomainSocket.access$200(DomainSocket.java:47)
        at org.apache.hadoop.net.unix.DomainSocket$DomainInputStream.read(DomainSocket.java:530)
        at java.io.FilterInputStream.read(FilterInputStream.java:66)
        at org.apache.hadoop.hdfs.protocol.HdfsProtoUtil.vintPrefixed(HdfsProtoUtil.java:169)
        at org.apache.hadoop.hdfs.BlockReaderFactory.newShortCircuitBlockReader(BlockReaderFactory.java:187)
        at org.apache.hadoop.hdfs.BlockReaderFactory.newBlockReader(BlockReaderFactory.java:104)
        at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:1060)
        at org.apache.hadoop.hdfs.DFSInputStream.fetchBlockByteRange(DFSInputStream.java:898)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:1148)
        at org.apache.hadoop.fs.FSDataInputStream.read(FSDataInputStream.java:73)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1388)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockDataInternal(HFileBlock.java:1880)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1723)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:365)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.readNextDataBlock(HFileReaderV2.java:633)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:730)
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:128)

Why does the exception "java.net.SocketTimeoutException: read(2) error: Resource temporarily unavailable" appear?

Thanks,
LiuLei
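[Editor's note: the dn_socket path in the log indicates HDFS short-circuit local reads over a Unix domain socket. As a hedged sketch only, here are the hdfs-site.xml properties involved in that path in Hadoop 2.x; the socket path is copied from the log above, the 60000 ms value is the documented default, and whether dfs.client.socket-timeout is the timeout that fires in this trace is an assumption, not confirmed by the source.]

```xml
<!-- hdfs-site.xml sketch: short-circuit local read settings (Hadoop 2.x) -->
<property>
  <name>dfs.client.read.shortcircuit</name>
  <value>true</value> <!-- enables the DomainSocket read path seen in the trace -->
</property>
<property>
  <name>dfs.domain.socket.path</name>
  <value>/home/hadoop/hadoop-current/cdh4-dn-socket/dn_socket</value> <!-- path from the log -->
</property>
<property>
  <name>dfs.client.socket-timeout</name>
  <value>60000</value> <!-- ms; assumed to bound the blocking read that timed out here -->
</property>
```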