From: Vinod Kumar Vavilapalli
To: user@hadoop.apache.org
Subject: Re: Error while running Hadoop Source Code
Date: Tue, 5 Nov 2013 10:09:36 -0800
Message-Id: <24AA4657-6F46-4E58-AC5E-CB5535E90280@hortonworks.com>
In-Reply-To: <5d91d5bc1ed9e19b20a91c68ec239087@ufl.edu>

It seems like your pipes mapper is exiting before consuming all the input. Did you check the task logs on the web UI?

Thanks,
+Vinod

On Nov 5, 2013, at 7:25 AM, Basu,Indrashish wrote:

> Hi,
>
> Can anyone kindly assist on this?
>
> Regards,
> Indrashish
>
> On Mon, 04 Nov 2013 10:23:23 -0500, Basu,Indrashish wrote:
>> Hi All,
>>
>> Any update on the below post?
>>
>> I came across an old post regarding the same issue. It explains the
>> solution as: "The nopipe example needs more documentation. It assumes
>> that it is run with the InputFormat from
>> src/test/org/apache/hadoop/mapred/pipes/WordCountInputFormat.java,
>> which has a very specific input split format.
>> By running with a TextInputFormat, it will send binary bytes as the
>> input split and won't work right. The nopipe example should probably
>> be recoded to use libhdfs too, but that is more complicated to get
>> running as a unit test. Also note that since the C++ example is using
>> local file reads, it will only work on a cluster if you have NFS or
>> something similar working across the cluster."
>>
>> I would need some more light on the above explanation, so it would
>> help if anyone could elaborate on what exactly needs to be done. To
>> mention, I am trying to run a sample KMeans algorithm on a GPU using
>> Hadoop.
>>
>> Thanks in advance.
>>
>> Regards,
>> Indrashish.
>>
>> On Thu, 31 Oct 2013 20:00:10 -0400, Basu,Indrashish wrote:
>>> Hi,
>>>
>>> I am trying to run a sample Hadoop GPU source code (kmeans
>>> algorithm) on an ARM processor and am getting the below error. Can
>>> anyone please throw some light on this?
>>>
>>> rmr: cannot remove output: No such file or directory.
>>> 13/10/31 13:43:12 WARN mapred.JobClient: No job jar file set. User
>>> classes may not be found. See JobConf(Class) or
>>> JobConf#setJar(String).
>>> 13/10/31 13:43:12 INFO mapred.FileInputFormat: Total input paths to
>>> process : 1
>>> 13/10/31 13:43:13 INFO mapred.JobClient: Running job: job_201310311320_0001
>>> 13/10/31 13:43:14 INFO mapred.JobClient:  map 0% reduce 0%
>>> 13/10/31 13:43:39 INFO mapred.JobClient: Task Id :
>>> attempt_201310311320_0001_m_000000_0, Status : FAILED
>>> java.io.IOException: pipe child exception
>>>     at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:191)
>>>     at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:103)
>>>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:363)
>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>>>     at org.apache.hadoop.mapred.Child.main(Child.java:170)
>>> Caused by: java.net.SocketException: Broken pipe
>>>     at java.net.SocketOutputStream.socketWrite0(Native Method)
>>>     at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113)
>>>     at java.net.SocketOutputStream.write(SocketOutputStream.java:159)
>>>     at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
>>>     at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
>>>     at java.io.DataOutputStream.write(DataOutputStream.java:107)
>>>     at org.apache.hadoop.mapred.pipes.BinaryProtocol.writeObject(BinaryProtocol.java:333)
>>>     at org.apache.hadoop.mapred.pipes.BinaryProtocol.mapItem(BinaryProtocol.java:286)
>>>     at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:92)
>>>     ... 3 more
>>>
>>> attempt_201310311320_0001_m_000000_0: cmd: [bash, -c, exec
>>> '/app/hadoop/tmp/mapred/local/taskTracker/archive/10.227.56.195bin/cpu-kmeans2D/cpu-kmeans2D'
>>> '0' < /dev/null 1>>
>>> /usr/local/hadoop/hadoop-gpu-0.20.1/bin/../logs/userlogs/attempt_201310311320_0001_m_000000_0/stdout
>>> 2>> /usr/local/hadoop/hadoop-gpu-0.20.1/bin/../logs/userlogs/
>>>
>>> Regards,
>
> --
> Indrashish Basu
> Graduate Student
> Department of Electrical and Computer Engineering
> University of Florida

--
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You.
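The explanation quoted in the thread comes down to submitting the pipes job with the split format the C++ binary actually expects, instead of the default TextInputFormat. A minimal sketch of such a submission, assuming the 0.20-era `hadoop pipes` submitter and the test-tree class named above; the input/output paths and program location are hypothetical placeholders, not values from this thread:

```shell
# Sketch only, not a verified command. Forces the InputFormat that the
# nopipe-style example assumes, and lets the C++ side read records itself
# (hadoop.pipes.java.recordreader=false). All paths are placeholders.
bin/hadoop pipes \
  -D hadoop.pipes.java.recordreader=false \
  -D hadoop.pipes.java.recordwriter=true \
  -inputformat org.apache.hadoop.mapred.pipes.WordCountInputFormat \
  -input kmeans-input \
  -output kmeans-output \
  -program bin/cpu-kmeans2D
```

Note that `WordCountInputFormat` lives in the test sources, so its class (or jar) must also be on the job classpath; the `No job jar file set` warning in the log above suggests it currently is not.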
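Vinod's diagnosis, and the `Broken pipe` frames in the trace, describe a generic failure mode: the child process stops reading before the parent has written all the input. A self-contained shell illustration of that mechanism (nothing Hadoop-specific; it only demonstrates the POSIX behavior):

```shell
# A reader that exits after one line (the "pipes mapper dying early")
# leaves the writer with a closed pipe. The writer's next write raises
# SIGPIPE, which bash reports as exit status 141 (128 + signal 13).
seq 1 1000000 | head -n 1
echo "writer exit status: ${PIPESTATUS[0]}"
```

In the Hadoop case the writer is the task JVM pushing key/value pairs over the pipes socket, so the same early exit surfaces as `java.net.SocketException: Broken pipe`; the C++ task's own stdout/stderr logs (linked from the web UI) usually show why it exited.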