From: "Basu,Indrashish" <indrashish@ufl.edu>
To: user@hadoop.apache.org
Subject: Re: Error while running Hadoop Source Code
Date: Tue, 05 Nov 2013 10:25:30 -0500

Hi,

Can anyone kindly assist on this?

Regards,
Indrashish

On Mon, 04 Nov 2013 10:23:23 -0500, Basu,Indrashish wrote:
> Hi All,
>
> Any update on the below post?
>
> I came across an old post regarding the same issue. It explains the
> solution as: "The nopipe example needs more documentation. It assumes
> that it is run with the InputFormat from
> src/test/org/apache/hadoop/mapred/pipes/WordCountInputFormat.java,
> which has a very specific input split format. By running with a
> TextInputFormat, it will send binary bytes as the input split and
> won't work right. The nopipe example should probably be recoded to
> use libhdfs too, but that is more complicated to get running as a
> unit test. Also note that since the C++ example is using local file
> reads, it will only work on a cluster if you have nfs or something
> working across the cluster."
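As a concrete reading of the quoted advice, here is a minimal sketch of a Pipes job submission that sets the pipes-specific WordCountInputFormat (from the test tree named above) instead of the default TextInputFormat. The class lives in the test sources, so it would have to be on the job's classpath; the job name, paths, and executable location below are placeholders, not values from this thread.

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.pipes.Submitter;
    import org.apache.hadoop.mapred.pipes.WordCountInputFormat;

    public class NopipeSubmit {
      public static void main(String[] args) throws Exception {
        // JobConf(Class) also tells Hadoop which jar holds user classes.
        JobConf conf = new JobConf(NopipeSubmit.class);
        conf.setJobName("pipes-nopipe-example"); // placeholder name

        // The point of the quoted advice: give the job the input split
        // format the nopipe example expects, not TextInputFormat.
        conf.setInputFormat(WordCountInputFormat.class);

        // The nopipe example reads its input directly on the C++ side,
        // so no Java record reader or writer is wanted here.
        Submitter.setIsJavaRecordReader(conf, false);
        Submitter.setIsJavaRecordWriter(conf, false);

        // HDFS path to the compiled C++ binary (placeholder).
        Submitter.setExecutable(conf, "bin/wordcount-nopipe");

        FileInputFormat.setInputPaths(conf, new Path("in"));   // placeholder
        FileOutputFormat.setOutputPath(conf, new Path("out")); // placeholder
        Submitter.runJob(conf);
      }
    }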
> I would need some more light on the above explanation, so if anyone
> could elaborate a bit on what exactly needs to be done, that would
> help. To mention, I am trying to run a sample KMeans algorithm on a
> GPU using Hadoop.
>
> Thanks in advance.
>
> Regards,
> Indrashish.
>
> On Thu, 31 Oct 2013 20:00:10 -0400, Basu,Indrashish wrote:
>> Hi,
>>
>> I am trying to run a sample Hadoop GPU source code (the kmeans
>> algorithm) on an ARM processor and getting the below error. Can
>> anyone please throw some light on this?
>>
>> rmr: cannot remove output: No such file or directory.
>> 13/10/31 13:43:12 WARN mapred.JobClient: No job jar file set. User
>> classes may not be found. See JobConf(Class) or
>> JobConf#setJar(String).
>> 13/10/31 13:43:12 INFO mapred.FileInputFormat: Total input paths to
>> process : 1
>> 13/10/31 13:43:13 INFO mapred.JobClient: Running job:
>> job_201310311320_0001
>> 13/10/31 13:43:14 INFO mapred.JobClient: map 0% reduce 0%
>> 13/10/31 13:43:39 INFO mapred.JobClient: Task Id :
>> attempt_201310311320_0001_m_000000_0, Status : FAILED
>> java.io.IOException: pipe child exception
>>         at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:191)
>>         at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:103)
>>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:363)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>>         at org.apache.hadoop.mapred.Child.main(Child.java:170)
>> Caused by: java.net.SocketException: Broken pipe
>>         at java.net.SocketOutputStream.socketWrite0(Native Method)
>>         at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113)
>>         at java.net.SocketOutputStream.write(SocketOutputStream.java:159)
>>         at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
>>         at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
>>         at java.io.DataOutputStream.write(DataOutputStream.java:107)
>>         at org.apache.hadoop.mapred.pipes.BinaryProtocol.writeObject(BinaryProtocol.java:333)
>>         at org.apache.hadoop.mapred.pipes.BinaryProtocol.mapItem(BinaryProtocol.java:286)
>>         at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:92)
>>         ... 3 more
>>
>> attempt_201310311320_0001_m_000000_0: cmd: [bash, -c, exec
>> '/app/hadoop/tmp/mapred/local/taskTracker/archive/10.227.56.195bin/cpu-kmeans2D/cpu-kmeans2D'
>> '0' < /dev/null 1>>
>> /usr/local/hadoop/hadoop-gpu-0.20.1/bin/../logs/userlogs/attempt_201310311320_0001_m_000000_0/stdout
>> 2>> /usr/local/hadoop/hadoop-gpu-0.20.1/bin/../logs/userlogs/
>>
>> Regards,

--
Indrashish Basu
Graduate Student
Department of Electrical and Computer Engineering
University of Florida
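Two details in the quoted log are worth noting for anyone picking this up. First, the "Broken pipe" at BinaryProtocol.writeObject means the framework was still writing input to the C++ child (here cpu-kmeans2D) when the child's end of the pipe closed, i.e. the child had already exited; the underlying reason is normally in the task's stderr file, the truncated 2>> path above. Second, the "No job jar file set" warning means the JobConf was never told which jar carries the user classes. A minimal sketch of the two remedies the warning itself names; the driver class and jar path are placeholders, not names from this thread:

    import org.apache.hadoop.mapred.JobConf;

    public class MyDriver {
      public static void main(String[] args) {
        // JobConf(Class): Hadoop locates the jar that contains MyDriver.
        JobConf conf = new JobConf(MyDriver.class);

        // JobConf#setJar(String): or name the jar explicitly instead.
        conf.setJar("/path/to/kmeans-job.jar");
      }
    }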