hadoop-mapreduce-user mailing list archives

From "Basu,Indrashish" <indrash...@ufl.edu>
Subject Re: Error while running Hadoop Source Code
Date Mon, 04 Nov 2013 15:23:23 GMT

Hi All,

Any update on the post below?

I came across an old post regarding the same issue. It describes the solution 
as follows: "The nopipe example needs more documentation. It assumes that it 
is run with the InputFormat from 
src/test/org/apache/hadoop/mapred/pipes/WordCountInputFormat.java, which has a 
very specific input split format. By running with a TextInputFormat, it will 
send binary bytes as the input split and won't work right. The nopipe example 
should probably be recoded to use libhdfs too, but that is more complicated 
to get running as a unit test. Also note that since the C++ example is using 
local file reads, it will only work on a cluster if you have nfs or something 
working across the cluster."
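
If I understand that advice correctly, the fix would be to submit the pipes 
job with that test InputFormat instead of the default TextInputFormat. The 
rough driver sketch below is what I imagine is meant -- the driver class, job 
name, paths and executable are placeholders of mine; only WordCountInputFormat 
comes from the quoted post:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.pipes.Submitter;

public class NopipeDriver {
  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(NopipeDriver.class);
    conf.setJobName("pipes-nopipe");

    // The quoted post says nopipe expects the split format produced by the
    // test WordCountInputFormat, not TextInputFormat. That class lives under
    // src/test/org/apache/hadoop/mapred/pipes, so it would have to be
    // compiled onto the job classpath (my assumption).
    conf.setInputFormat(org.apache.hadoop.mapred.pipes.WordCountInputFormat.class);

    FileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    // Placeholder HDFS path to the C++ pipes executable.
    Submitter.setExecutable(conf, "bin/nopipe-example");

    Submitter.runJob(conf);
  }
}

Please correct me if the intended change is something different.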

I would appreciate some more light on the above explanation, so it would help 
if anyone could elaborate on what exactly needs to be done. For context, I am 
trying to run a sample KMeans algorithm on a GPU using Hadoop.

Thanks in advance.

Regards,
Indrashish.

On Thu, 31 Oct 2013 20:00:10 -0400, Basu,Indrashish wrote:
> Hi,
>
> I am trying to run a sample Hadoop GPU source code (kmeans algorithm)
> on an ARM processor and getting the below error. Can anyone please
> throw some light on this ?
>
> rmr: cannot remove output: No such file or directory.
> 13/10/31 13:43:12 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
> 13/10/31 13:43:12 INFO mapred.FileInputFormat: Total input paths to process : 1
> 13/10/31 13:43:13 INFO mapred.JobClient: Running job: job_201310311320_0001
> 13/10/31 13:43:14 INFO mapred.JobClient:  map 0% reduce 0%
> 13/10/31 13:43:39 INFO mapred.JobClient: Task Id : attempt_201310311320_0001_m_000000_0, Status : FAILED
> java.io.IOException: pipe child exception
>     at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:191)
>     at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:103)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:363)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>     at org.apache.hadoop.mapred.Child.main(Child.java:170)
> Caused by: java.net.SocketException: Broken pipe
>     at java.net.SocketOutputStream.socketWrite0(Native Method)
>     at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113)
>     at java.net.SocketOutputStream.write(SocketOutputStream.java:159)
>     at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
>     at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
>     at java.io.DataOutputStream.write(DataOutputStream.java:107)
>     at org.apache.hadoop.mapred.pipes.BinaryProtocol.writeObject(BinaryProtocol.java:333)
>     at org.apache.hadoop.mapred.pipes.BinaryProtocol.mapItem(BinaryProtocol.java:286)
>     at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:92)
>     ... 3 more
>
> attempt_201310311320_0001_m_000000_0: cmd: [bash, -c, exec '/app/hadoop/tmp/mapred/local/taskTracker/archive/10.227.56.195bin/cpu-kmeans2D/cpu-kmeans2D' '0' < /dev/null 1>> /usr/local/hadoop/hadoop-gpu-0.20.1/bin/../logs/userlogs/attempt_201310311320_0001_m_000000_0/stdout 2>> /usr/local/hadoop/hadoop-gpu-0.20.1/bin/../logs/userlogs/
>
> Regards,
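
Also, about the "No job jar file set" warning in the log above: as far as I 
understand, it only means the job was submitted without a jar carrying the 
user classes, and the later "Broken pipe" is the framework noticing that the 
C++ child process died while the input split was being written to it. In case 
it matters, here is a minimal sketch of how I assume the jar could be attached 
(the class name and path are placeholders of mine):

import org.apache.hadoop.mapred.JobConf;

public class KMeansPipesJob {
  public static void main(String[] args) {
    // JobConf(Class) lets Hadoop locate the jar that contains this class,
    // which is what the "See JobConf(Class) or JobConf#setJar(String)"
    // hint in the warning points at.
    JobConf conf = new JobConf(KMeansPipesJob.class);

    // Equivalent explicit form (placeholder path, not from the log):
    // conf.setJar("/usr/local/hadoop/kmeans-job.jar");
  }
}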

-- 
Indrashish Basu
Graduate Student
Department of Electrical and Computer Engineering
University of Florida
