incubator-oozie-users mailing list archives

From Grant Ingersoll <gsing...@apache.org>
Subject Re: Oozie Security/Impersonation
Date Fri, 20 Apr 2012 18:00:44 GMT
OK, that was it.  I had wildcards in for the proxy users the first time around.
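
For reference, with the Oozie server running as user hadoop (as in this thread), a working explicit-value configuration would look something like the fragment below. The hostname and group here are illustrative placeholders, not values from the thread; the resolution above suggests that wildcard (*) values in these properties were not accepted by this Hadoop version.

```xml
<!-- core-site.xml on the NameNode and JobTracker (restart Hadoop after editing).
     Hostname and group are illustrative assumptions. -->
<property>
  <name>hadoop.proxyuser.hadoop.hosts</name>
  <value>oozie-host.example.com</value>
</property>
<property>
  <name>hadoop.proxyuser.hadoop.groups</name>
  <value>hadoop</value>
</property>
```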

Thanks for all the help,
Grant


On Apr 20, 2012, at 1:50 PM, Grant Ingersoll wrote:

> Hmm, so user XXX does have some files in HDFS, so that may be why it is in there after all.  Retrying the proxy now.
> 
> 
> On Apr 20, 2012, at 1:39 PM, Grant Ingersoll wrote:
> 
>> I tried the proxy suggestions before (I had googled the error and got the Hadoop Impersonation page), but I will try again.  I tried them in both hadoop and oozie, but likely not both at the same time.  I did use wildcards, so perhaps I will go try w/ explicit values.  That being said, I had pretty much this same setup using Hadoop 0.20.205 and Oozie 3.1.3 and didn't have this problem.
>> 
>> Here's some more info of things that look odd in my logs:
>> 
>> When I look at Hadoop logs, I see:
>> 
>> 2012-04-20 13:05:26,126 WARN org.apache.hadoop.security.ShellBasedUnixGroupsMapping: got exception trying to get groups for user XXX
>> org.apache.hadoop.util.Shell$ExitCodeException: id: XXX: No such user
>> 
>> 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
>> 	at org.apache.hadoop.util.Shell.run(Shell.java:182)
>> 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
>> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
>> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
>> 	at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:68)
>> 	at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:45)
>> 	at org.apache.hadoop.security.Groups.getGroups(Groups.java:79)
>> 	at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:998)
>> 	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.<init>(FSPermissionChecker.java:50)
>> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5210)
>> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkTraverse(FSNamesystem.java:5193)
>> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:2019)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.getFileInfo(NameNode.java:848)
>> 	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
>> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> 	at java.lang.reflect.Method.invoke(Method.java:616)
>> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
>> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>> 	at java.security.AccessController.doPrivileged(Native Method)
>> 	at javax.security.auth.Subject.doAs(Subject.java:416)
>> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>> 2012-04-20 13:05:26,127 WARN org.apache.hadoop.security.UserGroupInformation: No groups available for user XXX
>> 2012-04-20 13:05:26,171 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 54310: readAndProcess threw exception org.apache.hadoop.security.AccessControlException: Connection from 10.0.0.23:56648 for protocol org.apache.hadoop.hdfs.protocol.ClientProtocol is unauthorized for user hadoop via hadoop. Count of bytes read: 0
>> org.apache.hadoop.security.AccessControlException: Connection from 10.0.0.23:56648 for protocol org.apache.hadoop.hdfs.protocol.ClientProtocol is unauthorized for user hadoop via hadoop
>> 	at org.apache.hadoop.ipc.Server$Connection.processOneRpc(Server.java:1287)
>> 	at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1182)
>> 	at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:537)
>> 	at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:344)
>> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
>> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
>> 	at java.lang.Thread.run(Thread.java:679)
>> 
>> 
>> Which is just strange, b/c my application, running on a different machine, which invokes Oozie via the OozieClient, is the only thing running as that user (where XXX is my username on my machine).  The machine running Oozie and Hadoop doesn't even have a user by that name.  In other words, somewhere, the client seems to be passing in that user name (XXX).
>> 
>> I am creating OozieClient as:
>> ...
>> properties.setProperty("user.name", "hadoop");
>> log.info("Workflow properties: {}", properties);
>> try {
>>   String jobId = oozieClient.run(properties);
>> 
>> So, in other words, I'm explicitly setting the user.name and you can see that propagate
through Oozie.
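
[For context, a minimal self-contained sketch of how such a submission might be assembled. The application path, hostname, and the commented-out client call are illustrative assumptions, not values from the thread; only the user.name property comes from the snippet above.]

```java
import java.util.Properties;

public class OozieSubmitSketch {

    // Assemble the workflow configuration. The user.name value must be a user
    // the Oozie server is allowed to impersonate on the Hadoop cluster
    // (via the hadoop.proxyuser.* settings discussed later in this thread).
    static Properties buildConf() {
        Properties properties = new Properties();
        properties.setProperty("user.name", "hadoop");
        // Hypothetical application path; replace with your deployed workflow.
        properties.setProperty("oozie.wf.application.path",
                "hdfs://namenode:54310/user/hadoop/examples/my-workflow");
        return properties;
    }

    public static void main(String[] args) {
        Properties properties = buildConf();
        System.out.println("Workflow properties: " + properties);
        // Submitting requires a running Oozie server, e.g.:
        // OozieClient oozieClient = new OozieClient("http://oozie-host:11000/oozie");
        // String jobId = oozieClient.run(properties);
    }
}
```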
>> 
>> I will keep digging.
>> 
>> 
>> On Apr 20, 2012, at 12:53 PM, Alejandro Abdelnur wrote:
>> 
>>> Grant,
>>> 
>>> You have to configure your Hadoop cluster with proxyuser for the hadoop
>>> user.
>>> 
>>> In the Hadoop core-site.xml files in your cluster (NN & JT), you have to
>>> add:
>>> 
>>> <!-- OOZIE -->
>>> <property>
>>>  <name>hadoop.proxyuser.OOZIE_SERVER_USER.hosts</name>
>>>  <value>OOZIE_HOSTNAME</value>
>>> </property>
>>> <property>
>>>  <name>hadoop.proxyuser.OOZIE_SERVER_USER.groups</name>
>>>  <value>USER_GROUPS_THAT_ALLOW_IMPERSONATION</value>
>>> </property>
>>> 
>>> You'll have to replace the capitalized sections with your specific
>>> values, and then restart Hadoop.
>>> 
>>> Thxs.
>>> 
>>> Alejandro
>>> 
>>> On Fri, Apr 20, 2012 at 9:27 AM, Grant Ingersoll <gsingers@apache.org>wrote:
>>> 
>>>> Hi,
>>>> 
>>>> I'm trying to get 3.2.0-SNAPSHOT (trunk as of yesterday) to work with
>>>> Hadoop 1.0.2.  I've got it built, etc. and hooked in the libs for Hadoop.
>>>> However, when I go to submit a workflow, I get
>>>> 
>>>> 2012-04-20 12:24:14,350 ERROR UserGroupInformation:1096 - PriviledgedActionException as:hadoop via hadoop cause:org.apache.hadoop.ipc.RemoteException: User: hadoop is not allowed to impersonate hadoop
>>>> 2012-04-20 12:24:14,351  INFO BaseJobServlet:539 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] AuthorizationException
>>>> org.apache.oozie.service.AuthorizationException: E0902: Exception occured: [org.apache.hadoop.ipc.RemoteException: User: hadoop is not allowed to impersonate hadoop]
>>>>     at org.apache.oozie.service.AuthorizationService.authorizeForApp(AuthorizationService.java:360)
>>>>     at org.apache.oozie.servlet.BaseJobServlet.checkAuthorizationForApp(BaseJobServlet.java:188)
>>>>     at org.apache.oozie.servlet.BaseJobsServlet.doPost(BaseJobsServlet.java:92)
>>>>     at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
>>>>     at org.apache.oozie.servlet.JsonRestServlet.service(JsonRestServlet.java:285)
>>>>     at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
>>>>     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
>>>>     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
>>>>     at org.apache.oozie.servlet.AuthFilter$2.doFilter(AuthFilter.java:126)
>>>>     at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:372)
>>>>     at org.apache.oozie.servlet.AuthFilter.doFilter(AuthFilter.java:131)
>>>> 
>>>> 
>>>> I am using the default oozie-site.xml.  I have simple authentication
>>>> turned on.  I have anonymous users turned on.  Moreover, as you can see by
>>>> the exception, I am running Oozie as the same user as I am running Hadoop.
>>>> I have tried uncommenting the proxy user in oozie-site.
>>>> 
>>>> Any thoughts on what I am missing?
>>>> 
>>>> Thanks,
>>>> Grant
>>>> 
>>>> PS: bin/oozie-setup.sh doesn't seem to support Hadoop 1.0.x yet, despite
>>>> the libraries being in hadooplibs.  The addtowar.sh script rejects the
>>>> version.
>>> 
>>> 
>>> 
>>> 
>>> -- 
>>> Alejandro
>> 
>> 

