From: Amareshwari Sri Ramadasu
To: common-user@hadoop.apache.org
Date: Thu, 31 Mar 2011 12:07:54 +0530
Subject: Re: Hadoop Pipes Error
In-Reply-To: <4D9406FF.9080103@orkash.com>

Here is an answer for your question in the old mail archive:
http://lucene.472066.n3.nabble.com/pipe-application-error-td650185.html

On 3/31/11 10:15 AM, "Adarsh Sharma" wrote:

Any update on the below error? Please guide.

Thanks & best regards,
Adarsh Sharma

Adarsh Sharma wrote:
> Dear all,
>
> Today I faced a problem while running a map-reduce job in C++.
> I am not able to understand the reason for the below error:
>
> 11/03/30 12:09:02 INFO mapred.JobClient: Task Id : attempt_201103301130_0011_m_000000_0, Status : FAILED
> java.io.IOException: pipe child exception
>     at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
>     at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>     at org.apache.hadoop.mapred.Child.main(Child.java:170)
> Caused by: java.io.EOFException
>     at java.io.DataInputStream.readByte(DataInputStream.java:250)
>     at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
>     at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
>     at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:114)
>
> attempt_201103301130_0011_m_000000_0: Hadoop Pipes Exception: failed to open at wordcount-nopipe.cc:82 in WordCountReader::WordCountReader(HadoopPipes::MapContext&)
>
> 11/03/30 12:09:02 INFO mapred.JobClient: Task Id : attempt_201103301130_0011_m_000001_0, Status : FAILED
> java.io.IOException: pipe child exception
>     at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
>     at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>     at org.apache.hadoop.mapred.Child.main(Child.java:170)
> Caused by: java.io.EOFException
>     at java.io.DataInputStream.readByte(DataInputStream.java:250)
>     at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
>     at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
>     at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:114)
>
> attempt_201103301130_0011_m_000001_0: Hadoop Pipes Exception: failed to open at wordcount-nopipe.cc:82 in WordCountReader::WordCountReader(HadoopPipes::MapContext&)
>
> 11/03/30 12:09:02 INFO mapred.JobClient: Task Id : attempt_201103301130_0011_m_000002_0, Status : FAILED
> java.io.IOException: pipe child exception
>     at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
>     at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>     at org.apache.hadoop.mapred.Child.main(Child.java:170)
> Caused by: java.io.EOFException
>     at java.io.DataInputStream.readByte(DataInputStream.java:250)
>     at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
>     at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
>     at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:114)
>
> attempt_201103301130_0011_m_000002_1: Hadoop Pipes Exception: failed to open at wordcount-nopipe.cc:82 in WordCountReader::WordCountReader(HadoopPipes::MapContext&)
>
> 11/03/30 12:09:15 INFO mapred.JobClient: Task Id : attempt_201103301130_0011_m_000000_2, Status : FAILED
> java.io.IOException: pipe child exception
>     at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
>     at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:35
>
> I tried to run the *wordcount-nopipe.cc* program in the */home/hadoop/project/hadoop-0.20.2/src/examples/pipes/impl* directory.
>
> make wordcount-nopipe
> bin/hadoop fs -put wordcount-nopipe bin/wordcount-nopipe
> bin/hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input gutenberg -output gutenberg-out11 -program bin/wordcount-nopipe
>
> or
>
> bin/hadoop pipes -D hadoop.pipes.java.recordreader=false -D hadoop.pipes.java.recordwriter=false -input gutenberg -output gutenberg-out11 -program bin/wordcount-nopipe
>
> but the error remains the same. I have attached my Makefile as well;
> please comment on it.
>
> I am able to run a simple wordcount.cpp program on the Hadoop cluster, but
> I don't know why this program fails with a broken-pipe error.
>
> Thanks & best regards,
> Adarsh Sharma