hadoop-mapreduce-user mailing list archives

From Ben Kim <benkimkim...@gmail.com>
Subject Re: Streaming Job map/reduce not working with scripts on 1.0.3
Date Fri, 04 Jan 2013 11:57:41 GMT
Never mind, the problem has been fixed.

The problem was a trailing {control-v}{control-m} (^M) carriage-return character after
#!/bin/bash on the first line
(which I blame on my teammate, who wrote the script in Windows Notepad!)
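For anyone hitting the same trace: the hidden ^M comes from Windows CRLF line endings, and the kernel then looks for an interpreter literally named "/bin/bash\r", which is why exec fails with "error=2, No such file or directory". A minimal sketch of how to spot and strip it (filenames here are illustrative; dos2unix works too if installed):

```shell
# Reproduce the failure mode: a script saved with Windows CRLF endings.
printf '#!/bin/bash\r\necho hello\r\n' > streaming-map.sh
chmod +x streaming-map.sh

# Make the carriage returns visible (cat -v shows them as ^M):
cat -v streaming-map.sh

# Strip the trailing CR from every line (GNU sed in-place edit):
sed -i 's/\r$//' streaming-map.sh

# The script now execs normally and prints "hello":
./streaming-map.sh
```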





On Fri, Jan 4, 2013 at 8:09 PM, Ben Kim <benkimkimben@gmail.com> wrote:

> Hi !
>
> I'm using hadoop-1.0.3 to run streaming jobs with map/reduce shell scripts
> such as this
>
> bin/hadoop jar ./contrib/streaming/hadoop-streaming-1.0.3.jar -input
> /input -output /output/015 -mapper "streaming-map.sh" -reducer
> "streaming-reduce.sh" -file /home/hadoop/streaming/streaming-map.sh -file
> /home/hadoop/streaming-reduce.sh
>
> but the job fails, and the task attempt log shows this:
>
> java.lang.RuntimeException: Error in configuring object
> 	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
> 	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
> 	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> 	at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.lang.reflect.InvocationTargetException
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
> 	... 9 more
> Caused by: java.lang.RuntimeException: Error in configuring object
> 	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
> 	at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
> 	... 14 more
> Caused by: java.lang.reflect.InvocationTargetException
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
> 	... 17 more
> Caused by: java.lang.RuntimeException: configuration exception
> 	at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:230)
> 	at org.apache.hadoop.streaming.PipeMapper.configure(PipeMapper.java:66)
> 	... 22 more
> Caused by: java.io.IOException: Cannot run program "/tmp/hadoop-hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201301041944_0001/attempt_201301041944_0001_m_000000_0/work/./streaming-map.sh": java.io.IOException: error=2, No such file or directory
> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
> 	at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:214)
> 	... 23 more
> Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
> 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
> 	at java.lang.ProcessImpl.start(ProcessImpl.java:65)
> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
> 	... 24 more
>
>
> I tried to see what the problem was and found that the missing file is
> a symbolic link which Hadoop isn't able to create; in fact the
> /tmp/hadoop-hadoop/...........00000_0/work directory doesn't exist at all.
>
>
> Here's an excerpt from the task attempt syslog (full text attached):
>
> 2013-01-04 19:44:43,304 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /tmp/hadoop-hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201301041944_0001/jars/streaming-map.sh <- /tmp/hadoop-hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201301041944_0001/attempt_201301041944_0001_m_000000_0/work/streaming-map.sh
>
>
> Hadoop thinks it has successfully created the symbolic link from
> .....job_201301041944_0001/jars/streaming-map.sh to
> job_201301041944_0001/attempt_201301041944_0001_m_000000_0/work/streaming-map.sh,
>
> but it actually hasn't, hence the error.
>
>
> If you have had the same experience or know a workaround, please comment!
> Otherwise I'll file a JIRA tomorrow, as it seems to be an obvious bug.
>
>
> Best regards,
>
> *Benjamin Kim*
> *benkimkimben at gmail*
>



-- 

*Benjamin Kim*
*benkimkimben at gmail*
