accumulo-notifications mailing list archives

From "Michael Berman (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (ACCUMULO-1282) Monitor requires jumping through hadoop permissions hoops (and granting accumulo broad permissions)
Date Tue, 16 Apr 2013 21:07:16 GMT

     [ https://issues.apache.org/jira/browse/ACCUMULO-1282?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Berman updated ACCUMULO-1282:
-------------------------------------

    Description: 
The monitor's master status box calls getContentSummary(new Path("/")) on HDFS, which requires
READ_EXECUTE on every directory under / (otherwise it fails with the stack trace below).  There
doesn't seem to be any way to grant this permission to a particular user or to change the
permissions on /, so for this to work you either need to run accumulo and hdfs as the same user
or add accumulo's user to hadoop's supergroup.  Either way, this seems to grant unnecessarily
broad permissions to accumulo.  Is there some other way to get disk usage information out of
hadoop with normal user-level permissions?
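
For context, a minimal standalone sketch of the call involved (the class name RootDu is
illustrative; the real call site is DefaultServlet.doAccumuloTable, per the trace below).
Because getContentSummary walks the whole subtree, the NameNode checks READ_EXECUTE on every
directory under the given path:
{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.ContentSummary;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RootDu {
  public static void main(String[] args) throws Exception {
    // Connects to the fs.default.name in the loaded Hadoop config,
    // acting as the user running this process (here, "accumulo").
    FileSystem fs = FileSystem.get(new Configuration());
    // Summarizing "/" makes the NameNode check READ_EXECUTE on every
    // directory in the namespace, including /system
    // (hadoop:supergroup:rwx------), so any non-superuser gets an
    // AccessControlException.
    ContentSummary du = fs.getContentSummary(new Path("/"));
    System.out.println("bytes used under /: " + du.getSpaceConsumed());
  }
}
{code}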


Stack trace running as separate users without special permissions:
{code}
2013-04-16 20:34:57,770 [servlets.BasicServlet] DEBUG:  org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=accumulo, access=READ_EXECUTE, inode="system":hadoop:supergroup:rwx------
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=accumulo, access=READ_EXECUTE, inode="system":hadoop:supergroup:rwx------
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:532)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:95)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
        at org.apache.hadoop.hdfs.DFSClient.getContentSummary(DFSClient.java:1438)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getContentSummary(DistributedFileSystem.java:251)
        at org.apache.accumulo.server.trace.TraceFileSystem.getContentSummary(TraceFileSystem.java:312)
        at org.apache.accumulo.server.monitor.servlets.DefaultServlet.doAccumuloTable(DefaultServlet.java:317)
        at org.apache.accumulo.server.monitor.servlets.DefaultServlet.pageBody(DefaultServlet.java:256)
        at org.apache.accumulo.server.monitor.servlets.BasicServlet.doGet(BasicServlet.java:61)
        at org.apache.accumulo.server.monitor.servlets.DefaultServlet.doGet(DefaultServlet.java:157)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
        at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
        at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
        at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
        at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
        at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
        at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
        at org.mortbay.jetty.Server.handle(Server.java:326)
        at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
        at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
        at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
        at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
        at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
        at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)
        at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=accumulo, access=READ_EXECUTE, inode="system":hadoop:supergroup:rwx------
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:199)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkSubAccess(FSPermissionChecker.java:168)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:137)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5468)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getContentSummary(FSNamesystem.java:2225)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getContentSummary(NameNode.java:986)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)

        at org.apache.hadoop.ipc.Client.call(Client.java:1107)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
        at sun.proxy.$Proxy1.getContentSummary(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
        at sun.proxy.$Proxy1.getContentSummary(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getContentSummary(DFSClient.java:1436)
        ... 22 more
{code}
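
One possible direction, as an untested sketch rather than a confirmed fix (the class name and
hard-coded path are illustrative): summarize only Accumulo's own directory, which the accumulo
user owns, instead of /.  The monitor would then report Accumulo's usage rather than
cluster-wide usage, but it needs no superuser rights:
{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.ContentSummary;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AccumuloDirDu {
  public static void main(String[] args) throws Exception {
    FileSystem fs = FileSystem.get(new Configuration());
    // "/accumulo" is the default instance.dfs.dir; substitute your own
    // setting.  Since the accumulo user owns this subtree, the recursive
    // permission check succeeds without superuser rights.
    ContentSummary du = fs.getContentSummary(new Path("/accumulo"));
    System.out.println("bytes used by accumulo: " + du.getSpaceConsumed());
  }
}
{code}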

  was: (same text, minus the "Stack trace running as separate users without special
permissions:" line that this update added above the stack trace)

    
> Monitor requires jumping through hadoop permissions hoops (and granting accumulo broad permissions)
> ---------------------------------------------------------------------------------------------------
>
>                 Key: ACCUMULO-1282
>                 URL: https://issues.apache.org/jira/browse/ACCUMULO-1282
>             Project: Accumulo
>          Issue Type: Bug
>    Affects Versions: 1.5.0
>            Reporter: Michael Berman
>            Priority: Minor
>
> (full description and stack trace quoted above)

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira
