Subject: Re: hadoop Exception: java.io.IOException: Couldn't set up IO streams
From: sudhakara st <sudhakara.st@gmail.com>
To: user@hadoop.apache.org
Date: Thu, 27 Feb 2014 09:35:56 +0530

"Caused by: java.lang.OutOfMemoryError: unable to create new native thread"

This line shows an out-of-memory error on the destination host
"l-hadoop2.prod.cn2.corp.agrant.cn": the destination host throws the error
because it cannot create the native thread it needs to process the RPC
request. Check the memory allocation for your Java processes; a rough sketch
of one way to check follows below.
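A minimal sketch (not from the original thread) of how you might inspect the
JVM's live thread count and memory headroom with the standard
java.lang.management API on the affected host; the class name
JvmResourceCheck and the printed labels are illustrative assumptions, not
part of Hadoop or Pig.

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadMXBean;

    // Hypothetical helper: prints the numbers to compare against the
    // process limits when "unable to create new native thread" appears.
    public class JvmResourceCheck {
        public static void main(String[] args) {
            ThreadMXBean threads = ManagementFactory.getThreadMXBean();
            Runtime rt = Runtime.getRuntime();

            // Live and peak thread counts for this JVM; a peak near the
            // OS per-user thread/process limit points at the native-thread OOM.
            System.out.println("Live threads:  " + threads.getThreadCount());
            System.out.println("Peak threads:  " + threads.getPeakThreadCount());

            // Heap headroom; note this particular OutOfMemoryError is usually
            // about native/OS thread limits rather than the Java heap itself.
            long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
            long maxMb  = rt.maxMemory() / (1024 * 1024);
            System.out.println("Heap used/max: " + usedMb + " MB / " + maxMb + " MB");
        }
    }

Running something like this on the host that throws the error (or logging the
same values from your own code) lets you compare the peak thread count and
heap settings against the limits configured for that process.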
On Thu, Feb 27, 2014 at 8:56 AM, leiwangouc@gmail.com <leiwangouc@gmail.com> wrote:

> Hi all,
>
> I wrote a Pig script and run it on a Hadoop cluster. Sometimes this
> exception occurs and sometimes it does not:
>
> java.io.IOException: Failed on local exception: java.io.IOException:
> Couldn't set up IO streams; Host Details : local host is:
> "l-hadoop13.prod.cn2.corp.agrant.cn/10.2.1.45"; destination host is:
> "l-hadoop2.prod.cn2.corp.agrant.cn":8020;
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1235)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>         at com.sun.proxy.$Proxy9.delete(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(ClientNamenodeProtocolTranslatorPB.java:408)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>         at com.sun.proxy.$Proxy10.delete(Unknown Source)
>         at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:1487)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:354)
>         at org.apache.pig.backend.hadoop.datastorage.HPath.delete(HPath.java:118)
>         at org.apache.pig.impl.io.FileLocalizer.deleteTempFiles(FileLocalizer.java:491)
>         at org.apache.pig.Main.run(Main.java:642)
>         at org.apache.pig.Main.main(Main.java:157)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> Caused by: java.io.IOException: Couldn't set up IO streams
>         at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:663)
>         at org.apache.hadoop.ipc.Client$Connection.access$2100(Client.java:243)
>         at org.apache.hadoop.ipc.Client.getConnection(Client.java:1284)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1202)
>         ... 21 more
> Caused by: java.lang.OutOfMemoryError: unable to create new native thread
>         at java.lang.Thread.start0(Native Method)
>         at java.lang.Thread.start(Thread.java:640)
>         at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:656)
>         ... 24 more
>
> Hadoop version: Hadoop 2.0.0-cdh4.3.1.
>
> Any insight on this? How can I fix it? Why does this error occur only
> occasionally?
>
> Thanks,
> Lei
> ------------------------------
> leiwangouc@gmail.com

--
Regards,
...sudhakara
"Caused by: java.lang.OutOfMemoryError: unabl= e to create new native=20 thread" this lines shows out memory error in the destination host = is:=20 "l-hadoop2.prod.cn2.corp.agrant.cn",
The destination hos= t throw error it unable process RPC request. Check with memory allocations = for java processes


On Thu= , Feb 27, 2014 at 8:56 AM, leiwango= uc@gmail.com <leiwangouc@gmail.com> wrote:
=A0
=A0
Hi all,
=A0
I write=A0a pig script and run it on hadoop cluster.=A0 Sometimes=20 there will be this exception and sometimes not.
=A0
java.io.IOException: Failed on local exception: java.io.IOException:= =20 Couldn't set up IO streams; Host Details : local host is:=20 "l-hadoop13.prod.cn2.corp.agrant.cn/10.2.1.45"; destin= ation host is:=20 "l-hadoop2.prod.cn2.corp.agrant.cn":8020;
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.ipc.Client.call(Client.java:1235)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.ja= va:202)
=A0=A0=A0=A0=A0=A0=A0 at=20 com.sun.proxy.$Proxy9.delete(Unknown Source)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete= (ClientNamenodeProtocolTranslatorPB.java:408)
=A0=A0=A0=A0=A0=A0=A0 at=20 sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
=A0=A0=A0=A0=A0=A0=A0 at=20 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:3= 9)
=A0=A0=A0=A0=A0=A0=A0 at=20 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImp= l.java:25)
=A0=A0=A0=A0=A0=A0=A0 at=20 java.lang.reflect.Method.invoke(Method.java:597)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocat= ionHandler.java:164)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHan= dler.java:83)
=A0=A0=A0=A0=A0=A0=A0 at=20 com.sun.proxy.$Proxy10.delete(Unknown Source)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:1487)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.j= ava:354)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.pig.backend.hadoop.datastorage.HPath.delete(HPath.java:118)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.pig.impl.io.FileLocalizer.deleteTempFiles(FileLocalizer.java:491= )
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.pig.Main.run(Main.java:642)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.pig.Main.main(Main.java:157)
=A0=A0=A0=A0=A0=A0=A0 at=20 sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
=A0=A0=A0=A0=A0=A0=A0 at=20 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:3= 9)
=A0=A0=A0=A0=A0=A0=A0 at=20 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImp= l.java:25)
=A0=A0=A0=A0=A0=A0=A0 at=20 java.lang.reflect.Method.invoke(Method.java:597)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.io.IOException: Couldn't set up IO streams
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:663)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.ipc.Client$Connection.access$2100(Client.java:243)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.ipc.Client.getConnection(Client.java:1284)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.ipc.Client.call(Client.java:1202)
=A0=A0=A0=A0=A0=A0=A0 ... 21 more
Caused by: java.lang.OutOfMemoryError: unable to create new native=20 thread
=A0=A0=A0=A0=A0=A0=A0 at=20 java.lang.Thread.start0(Native Method)
=A0=A0=A0=A0=A0=A0=A0 at=20 java.lang.Thread.start(Thread.java:640)
=A0=A0=A0=A0=A0=A0=A0 at=20 org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:656)
=A0
=A0=A0=A0=A0=A0=A0 ... 24 mor
=A0Hadoop verion: Hadoop 2.0.0-cdh4.3.1.
=A0
Any insight on this? How could i fix it? Why this error occurs=20 occasionally?
=A0
Thanks,
Lei




--
=A0 = =A0 =A0=A0
Regards,
...sudhakara
=A0 =A0 =A0 =A0 =A0 =A0 =A0 =A0 =A0 =A0 =A0=A0
--f46d0444e8d1357ebb04f35b705c--