hbase-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: problems for hbase bulkload
Date Tue, 10 Jun 2014 02:59:32 GMT
What user did you use for the secure bulk load?
See the steps outlined in the javadoc of SecureBulkLoadEndpoint (first two
steps copied below):

 * 1. Create an hbase owned staging directory which is
 *    world traversable (711): /hbase/staging
 * 2. A user writes out data to his secure output directory: /user/foo/data
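
As a minimal sketch of step 1 (assuming the default /hbase/staging path
quoted above, which in this era of HBase should be configurable via
hbase.bulkload.staging.dir), run as the hbase service user:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class CreateStagingDir {
      public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());

        // Staging path from the javadoc above; assumed default here.
        Path staging = new Path("/hbase/staging");

        // Create the directory, then set 711 (world traversable)
        // explicitly so the cluster umask cannot narrow it.
        fs.mkdirs(staging);
        fs.setPermission(staging, new FsPermission((short) 0711));
      }
    }

The equivalent hdfs dfs -mkdir plus -chmod 711 as the hbase user works
just as well; the point is only that hbase owns the directory and
everyone can traverse it.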

BTW there have been several fixes for secure bulk load since 0.96.1.1 - you
may want to try a newer release.

You can find the path that caused the permission problem in the namenode
(audit) log.
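
Since the trace below fails inside
SecureBulkLoadEndpoint.createStagingDir at a setPermission call, with
FSPermissionChecker.checkOwner on top, the caller is trying to chmod a
path it does not own. A quick sketch of checking which user you actually
run as and who owns the path (the path below is illustrative; substitute
the one from the audit log):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class WhoOwnsIt {
      public static void main(String[] args) throws Exception {
        // Effective (Kerberos) user the client runs as.
        System.out.println("current user: "
            + UserGroupInformation.getCurrentUser().getUserName());

        // Owner and permissions of the rejected path.
        FileSystem fs = FileSystem.get(new Configuration());
        FileStatus st = fs.getFileStatus(new Path("/hbase/staging"));
        System.out.println(st.getOwner() + ":" + st.getGroup()
            + " " + st.getPermission());
      }
    }

If the printed user does not own the directory being chmod'ed, that
mismatch is the problem.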

Cheers


On Mon, Jun 9, 2014 at 7:14 PM, jhaobull <jhaobull@gmail.com> wrote:

> Hi, everyone:
>
> Our HBase version is 0.96.1.1-hadoop2, and we use Kerberos for security.
>
> But when I use bulk load, the code reaches this point in
> LoadIncrementalHFiles (line 307):
>
>
> ....
> new SecureBulkLoadClient(table).cleanupBulkLoad(bulkToken);
> ....
>
>
> it throws the exception below:
>
> Does anyone have ideas on how to resolve it? Thanks!
>
>
> -------
>
>
> 2014-06-09 13:17:16,876 WARN [htable-pool1-t22] ipc.CoprocessorRpcChannel
> (CoprocessorRpcChannel.java:callMethod(53)) - Call failed on IOException
>  org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after
> attempts=1, exceptions:
>  Mon Jun 09 13:17:16 CST 2014,
> org.apache.hadoop.hbase.client.RpcRetryingCaller@1200d083,
> org.apache.hadoop.security.AccessControlException:
> org.apache.hadoop.security.AccessControlException: Permission denied
>   at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkOwner(FSPermissionChecker.java:180)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:170)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5185)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5167)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOwner(FSNamesystem.java:5123)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermissionInt(FSNamesystem.java:1338)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermission(FSNamesystem.java:1317)
>   at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setPermission(NameNodeRpcServer.java:528)
>   at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setPermission(ClientNamenodeProtocolServerSideTranslatorPB.java:348)
>   at
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59576)
>   at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:415)
>   at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>   at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>   at
> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>   at
> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
>   at org.apache.hadoop.hdfs.DFSClient.setPermission(DFSClient.java:2047)
>   at
> org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1248)
>   at
> org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1244)
>   at
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>   at
> org.apache.hadoop.hdfs.DistributedFileSystem.setPermission(DistributedFileSystem.java:1244)
>   at
> org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint.createStagingDir(SecureBulkLoadEndpoint.java:282)
>   at
> org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint.cleanupBulkLoad(SecureBulkLoadEndpoint.java:172)
>   at
> org.apache.hadoop.hbase.protobuf.generated.SecureBulkLoadProtos$SecureBulkLoadService.callMethod(SecureBulkLoadProtos.java:4636)
>   at
> org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:5102)
>   at
> org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3198)
>   at
> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:26933)
>   at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2175)
>   at org.apache.hadoop.hbase.ipc.RpcServer$Handler.run(RpcServer.java:1879)
>  Caused by:
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException):
> Permission denied
>   at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkOwner(FSPermissionChecker.java:180)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:170)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5185)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5167)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOwner(FSNamesystem.java:5123)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermissionInt(FSNamesystem.java:1338)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermission(FSNamesystem.java:1317)
>   at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setPermission(NameNodeRpcServer.java:528)
>   at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setPermission(ClientNamenodeProtocolServerSideTranslatorPB.java:348)
>   at
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59576)
>   at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:415)
>   at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
>   at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>   at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>   at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>   at com.sun.proxy.$Proxy13.setPermission(Unknown Source)
>   at sun.reflect.GeneratedMethodAccessor33.invoke(Unknown Source)
>   at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:606)
>   at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>   at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>   at com.sun.proxy.$Proxy13.setPermission(Unknown Source)
>   at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setPermission(ClientNamenodeProtocolTranslatorPB.java:277)
>   at sun.reflect.GeneratedMethodAccessor31.invoke(Unknown Source)
>   at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:606)
>   at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:266)
>   at com.sun.proxy.$Proxy14.setPermission(Unknown Source)
>   at org.apache.hadoop.hdfs.DFSClient.setPermission(DFSClient.java:2045)
>   ... 12 more
>
>
>   at
> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:134)
>   at
> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:96)
>   at
> org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:90)
>   at
> org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:51)
>   at
> org.apache.hadoop.hbase.protobuf.generated.SecureBulkLoadProtos$SecureBulkLoadService$Stub.cleanupBulkLoad(SecureBulkLoadProtos.java:4735)
>   at
> org.apache.hadoop.hbase.client.coprocessor.SecureBulkLoadClient$2.call(SecureBulkLoadClient.java:109)
>   at
> org.apache.hadoop.hbase.client.coprocessor.SecureBulkLoadClient$2.call(SecureBulkLoadClient.java:96)
>   at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1484)
>   at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>   at
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>   at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>   at java.lang.Thread.run(Thread.java:662)
>  Caused by: org.apache.hadoop.security.AccessControlException:
> org.apache.hadoop.security.AccessControlException: Permission denied
>   at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkOwner(FSPermissionChecker.java:180)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:170)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5185)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5167)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOwner(FSNamesystem.java:5123)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermissionInt(FSNamesystem.java:1338)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermission(FSNamesystem.java:1317)
>   at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setPermission(NameNodeRpcServer.java:528)
>   at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setPermission(ClientNamenodeProtocolServerSideTranslatorPB.java:348)
>   at
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59576)
>   at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:415)
>   at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>   at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>   at
> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>   at
> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
>   at org.apache.hadoop.hdfs.DFSClient.setPermission(DFSClient.java:2047)
>   at
> org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1248)
>   at
> org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1244)
>   at
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>   at
> org.apache.hadoop.hdfs.DistributedFileSystem.setPermission(DistributedFileSystem.java:1244)
>   at
> org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint.createStagingDir(SecureBulkLoadEndpoint.java:282)
>   at
> org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint.cleanupBulkLoad(SecureBulkLoadEndpoint.java:172)
>   at
> org.apache.hadoop.hbase.protobuf.generated.SecureBulkLoadProtos$SecureBulkLoadService.callMethod(SecureBulkLoadProtos.java:4636)
>   at
> org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:5102)
>   at
> org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3198)
>   at
> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:26933)
>   at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2175)
>   at org.apache.hadoop.hbase.ipc.RpcServer$Handler.run(RpcServer.java:1879)
>  Caused by:
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException):
> Permission denied
>   at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkOwner(FSPermissionChecker.java:180)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:170)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5185)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5167)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOwner(FSNamesystem.java:5123)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermissionInt(FSNamesystem.java:1338)
>   at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermission(FSNamesystem.java:1317)
>   at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setPermission(NameNodeRpcServer.java:528)
>   at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setPermission(ClientNamenodeProtocolServerSideTranslatorPB.java:348)
>   at
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59576)
>   at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:415)
>   at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
>   at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>   at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>   at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>   at com.sun.proxy.$Proxy13.setPermission(Unknown Source)
>   at sun.reflect.GeneratedMethodAccessor33.invoke(Unknown Source)
>   at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:606)
>   at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>   at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>   at com.sun.proxy.$Proxy13.setPermission(Unknown Source)
>   at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setPermission(ClientNamenodeProtocolTranslatorPB.java:277)
>   at sun.reflect.GeneratedMethodAccessor31.invoke(Unknown Source)
>   at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:606)
>   at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:266)
>   at com.sun.proxy.$Proxy14.setPermission(Unknown Source)
>   at org.apache.hadoop.hdfs.DFSClient.setPermission(DFSClient.java:2045)
>   ... 12 more
>
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>   at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>   at
> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>   at
> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>   at
> org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:277)
>   at
> org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1448)
>   at
> org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:87)
>   at
> org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:84)
>   at
> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:120)
