hadoop-common-user mailing list archives

From Shi Yu <sh...@uchicago.edu>
Subject Re: Hadoop Streaming job Fails - Permission Denied error
Date Wed, 14 Sep 2011 13:50:35 GMT
Just a quick question: have you tried switching off the DFS permission check 
in your hdfs-site.xml file?

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
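Note that dfs.permissions is read by the NameNode, so the change only takes effect after a NameNode restart, and it disables permission checking cluster-wide, so treat it as a debugging aid rather than a fix. A sketch, assuming the CDH3 packaged init scripts on the Cloudera demo VM (the service name here is an assumption; check /etc/init.d/ on your install):

```shell
# Assumed CDH3 service name -- verify with: ls /etc/init.d/ | grep hadoop
sudo service hadoop-0.20-namenode restart
```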


Shi

On 9/14/2011 8:29 AM, Brock Noland wrote:
> Hi,
>
> This probably belongs on mapreduce-user as opposed to common-user. I
> have BCC'ed the common-user group.
>
> Generally it's a best practice to ship the scripts with the job. Like so:
>
> hadoop  jar
> /usr/lib/hadoop-0.20/contrib/streaming/hadoop-streaming-0.20.2-cdh3u0.jar
> -input /userdata/bejoy/apps/wc/input -output /userdata/bejoy/apps/wc/output
> -mapper WcStreamMap.py  -reducer WcStreamReduce.py
> -file /home/cloudera/bejoy/apps/inputs/wc/WcStreamMap.py
> -file /home/cloudera/bejoy/apps/inputs/wc/WcStreamReduce.py
>
> Brock
>
> On Mon, Sep 12, 2011 at 4:18 AM, Bejoy KS<bejoy.hadoop@gmail.com>  wrote:
>> Hi
>>       I wanted to try out Hadoop Streaming and got the sample Python code for
>> the mapper and reducer. I copied both onto my local file system (lfs) and
>> tried running the streaming job as mentioned in the documentation.
>> Here is the command I used to run the job:
>>
>> hadoop  jar
>> /usr/lib/hadoop-0.20/contrib/streaming/hadoop-streaming-0.20.2-cdh3u0.jar
>> -input /userdata/bejoy/apps/wc/input -output /userdata/bejoy/apps/wc/output
>> -mapper /home/cloudera/bejoy/apps/inputs/wc/WcStreamMap.py  -reducer
>> /home/cloudera/bejoy/apps/inputs/wc/WcStreamReduce.py
>>
>> Here, other than the input and output, everything else is on lfs locations.
>> However, the job is failing. The error log from the JobTracker URL is as
>> follows:
>>
>> java.lang.RuntimeException: Error in configuring object
>>     at
>> org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>>     at
>> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>>     at
>> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:386)
>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:324)
>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:396)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> Caused by: java.lang.reflect.InvocationTargetException
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>     at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>     at java.lang.reflect.Method.invoke(Method.java:597)
>>     at
>> org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>>     ... 9 more
>> Caused by: java.lang.RuntimeException: Error in configuring object
>>     at
>> org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>>     at
>> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>>     at
>> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>>     at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
>>     ... 14 more
>> Caused by: java.lang.reflect.InvocationTargetException
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>     at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>     at java.lang.reflect.Method.invoke(Method.java:597)
>>     at
>> org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>>     ... 17 more
>> Caused by: java.lang.RuntimeException: configuration exception
>>     at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:230)
>>     at org.apache.hadoop.streaming.PipeMapper.configure(PipeMapper.java:66)
>>     ... 22 more
>> Caused by: java.io.IOException: Cannot run program
>> "/home/cloudera/bejoy/apps/inputs/wc/WcStreamMap.py": java.io.IOException:
>> error=13, Permission denied
>>     at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
>>     at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:214)
>>     ... 23 more
>> Caused by: java.io.IOException: java.io.IOException: error=13, Permission
>> denied
>>     at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
>>     at java.lang.ProcessImpl.start(ProcessImpl.java:65)
>>     at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
>>     ... 24 more
>>
>> Seeing the error, I checked the permissions of the mapper and reducer, and
>> issued a chmod 777 on both as well. Still no luck.
>>
>> The permission of the files are as follows
>> cloudera@cloudera-vm:~$ ls -l /home/cloudera/bejoy/apps/inputs/wc/
>> -rwxrwxrwx 1 cloudera cloudera  707 2011-09-11 23:42 WcStreamMap.py
>> -rwxrwxrwx 1 cloudera cloudera 1077 2011-09-11 23:42 WcStreamReduce.py
>>
>> I'm testing this on the Cloudera Demo VM, so the Hadoop setup runs in
>> pseudo-distributed mode. Any help would be highly appreciated.
>>
>> Thank You
>>
>> Regards
>> Bejoy.K.S
>>
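The stack trace bottoms out in java.lang.UNIXProcess, i.e. the task's child process failing to exec the local script. So besides shipping the scripts with -file as suggested above, it is worth checking that each .py file has a clean shebang line (no Windows CRLF endings) and that the user the task runs as can actually reach the path. A minimal sketch of the exec-permission failure mode, using a throwaway directory and a tiny shell mapper as a stand-in for WcStreamMap.py:

```shell
# Illustrative only: reproduce "error=13, Permission denied" locally.
set -e
dir=$(mktemp -d)
cat > "$dir/map.sh" <<'EOF'
#!/bin/sh
# toy streaming mapper: emit "word<TAB>1" for every input word
awk '{ for (i = 1; i <= NF; i++) print $i "\t1" }'
EOF

# Without the execute bit, exec() fails with EACCES (errno 13) -- the
# same error=13 that ProcessBuilder.start() surfaces in the job log.
if echo hello | "$dir/map.sh" 2>/dev/null; then
  status=ran
else
  status=denied
fi
echo "before chmod: $status"

chmod +x "$dir/map.sh"
echo "hello hello world" | "$dir/map.sh"
```

Note that chmod 777 on the files alone is not always enough: every directory component of the path must also be traversable (x bit) by the user the task runs as, which is one more reason -file is the safer route.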


