From: Ben Kim <benkimkimben@gmail.com>
Date: Fri, 4 Jan 2013 20:57:41 +0900
Subject: Re: Streaming Job map/reduce not working with scripts on 1.0.3
To: mapreduce-user@hadoop.apache.org

Never mind; the problem has been fixed.

The problem was a trailing {control-v}{control-m} character (a literal
carriage return, ^M) at the end of the first line, #!/bin/bash
(which I blame on my teammate for writing the script in Windows Notepad!).

On Fri, Jan 4, 2013 at 8:09 PM, Ben Kim wrote:

> Hi!
>
> I'm using hadoop-1.0.3 to run streaming jobs with map/reduce shell
> scripts, such as this:
>
> bin/hadoop jar ./contrib/streaming/hadoop-streaming-1.0.3.jar -input
> /input -output /output/015 -mapper "streaming-map.sh" -reducer
> "streaming-reduce.sh" -file /home/hadoop/streaming/streaming-map.sh -file
> /home/hadoop/streaming-reduce.sh
>
> but the job fails, and the task attempt log shows this:
>
> java.lang.RuntimeException: Error in configuring object
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>     ... 9 more
> Caused by: java.lang.RuntimeException: Error in configuring object
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>     at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
>     ... 14 more
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>     ... 17 more
> Caused by: java.lang.RuntimeException: configuration exception
>     at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:230)
>     at org.apache.hadoop.streaming.PipeMapper.configure(PipeMapper.java:66)
>     ... 22 more
> Caused by: java.io.IOException: Cannot run program "/tmp/hadoop-hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201301041944_0001/attempt_201301041944_0001_m_000000_0/work/./streaming-map.sh": java.io.IOException: error=2, No such file or directory
>     at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
>     at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:214)
>     ... 23 more
> Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
>     at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
>     at java.lang.ProcessImpl.start(ProcessImpl.java:65)
>     at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
>     ... 24 more
>
> I tried to see what the problem is and found out that the missing file
> is a symbolic link that Hadoop isn't able to create; in fact, the
> /tmp/hadoop-hadoop/...........00000_0/work directory doesn't exist at all.
>
> Here's an excerpt from the task attempt syslog (full text attached):
>
> 2013-01-04 19:44:43,304 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /tmp/hadoop-hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201301041944_0001/jars/streaming-map.sh <- /tmp/hadoop-hadoop/mapred/local/taskTracker/hadoop/jobcache/job_201301041944_0001/attempt_201301041944_0001_m_000000_0/work/streaming-map.sh
>
> Hadoop believes it has successfully created the symbolic link from
> .....job_201301041944_0001/jars/streaming-map.sh to
> job_201301041944_0001/attempt_201301041944_0001_m_000000_0/work/streaming-map.sh,
> but it actually hasn't, hence the error.
>
> If you've had the same experience or know a workaround, please comment!
> Otherwise I'll file a JIRA tomorrow, as it seems to be an obvious bug.
>
> Best regards,
>
> Benjamin Kim
> benkimkimben at gmail

--
Benjamin Kim
benkimkimben at gmail
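For anyone who hits the same symptom: the failure is reproducible outside Hadoop. When a shebang line ends in a Windows carriage return, the kernel parses the interpreter path as "/bin/bash\r", which does not exist, so exec fails with "No such file or directory" even though the script itself is present. A minimal sketch (file names are illustrative):

```shell
# Reproduce the failure: write a script whose shebang line ends with a
# Windows carriage return (\r). Exec fails with ENOENT because the kernel
# looks for an interpreter literally named "/bin/bash\r".
printf '#!/bin/bash\r\necho hello\n' > bad.sh
chmod +x bad.sh
./bad.sh || echo "exec failed: bad interpreter / No such file or directory"

# The fix: strip the carriage returns (dos2unix does the same thing).
tr -d '\r' < bad.sh > good.sh
chmod +x good.sh
./good.sh          # prints "hello"
```

Running dos2unix on the script in place (or `sed -i 's/\r$//' script.sh` with GNU sed) is the usual one-step cleanup before shipping a script to the cluster with -file.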