hbase-issues mailing list archives

From "Ted Yu (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (HBASE-17435) Call to preCommitStoreFile() hook encounters SaslException in secure deployment
Date Sat, 07 Jan 2017 09:42:58 GMT

     [ https://issues.apache.org/jira/browse/HBASE-17435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ted Yu updated HBASE-17435:
---------------------------
    Description: 
[~romil.choksi] was testing bulk load in a secure cluster where LoadIncrementalHFiles failed.

Looking at the region server log, we saw the following:
{code}
        at org.apache.hadoop.hbase.client.ClientSmallReversedScanner.next(ClientSmallReversedScanner.java:185)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1257)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1163)
        at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:300)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
        at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:327)
        at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:302)
        at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:167)
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:162)
        at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
        at org.apache.hadoop.hbase.backup.impl.BackupSystemTable.getBackupContexts(BackupSystemTable.java:540)
        at org.apache.hadoop.hbase.backup.impl.BackupSystemTable.getBackupHistory(BackupSystemTable.java:517)
        at org.apache.hadoop.hbase.backup.impl.BackupSystemTable.getTablesForBackupType(BackupSystemTable.java:589)
        at org.apache.hadoop.hbase.backup.BackupObserver.preCommitStoreFile(BackupObserver.java:89)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$61.call(RegionCoprocessorHost.java:1494)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$RegionOperation.call(RegionCoprocessorHost.java:1660)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1734)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1692)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preCommitStoreFile(RegionCoprocessorHost.java:1490)
        at org.apache.hadoop.hbase.regionserver.HRegion.bulkLoadHFiles(HRegion.java:5512)
        at org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint$1.run(SecureBulkLoadEndpoint.java:293)
        at org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint$1.run(SecureBulkLoadEndpoint.java:276)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:356)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1704)
        at org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint.secureBulkLoadHFiles(SecureBulkLoadEndpoint.java:276)
        at org.apache.hadoop.hbase.protobuf.generated.SecureBulkLoadProtos$SecureBulkLoadService.callMethod(SecureBulkLoadProtos.java:4721)
...
Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:679)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:637)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:745)
        ... 16 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
        at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
{code}
The cause was that the security token for accessing HBase was missing in the coprocessor hook, leading to the SaslException above.
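
For context, the trace shows the hook firing inside the doAs() block that SecureBulkLoadEndpoint sets up for the RPC request user. The sketch below is illustrative only (class and method names other than the Hadoop ones are not from HBase source); it shows why a client RPC issued from the hook has no credentials to offer:
{code}
// Illustrative sketch of the failing pattern in the stack trace above.
// SecureBulkLoadEndpoint runs the bulk load under doAs() for the RPC request
// user, so any HBase client RPC issued from a coprocessor hook inside that
// block (here, BackupObserver scanning the backup system table) authenticates
// as that user. On the region server that user has neither a Kerberos TGT nor
// an HBase auth token, so the GSSAPI handshake in RpcClientImpl fails.
import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.security.UserGroupInformation;

public class RequestUserContextSketch {
  void bulkLoadAs(UserGroupInformation requestUser) throws Exception {
    requestUser.doAs(new PrivilegedExceptionAction<Void>() {
      @Override
      public Void run() throws Exception {
        // region.bulkLoadHFiles(...) -> preCommitStoreFile() -> BackupObserver
        // opens a scanner against the backup table; that RPC is what throws
        // the SaslException shown above.
        return null;
      }
    });
  }
}
{code}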

This JIRA adds support for the case where the observer hook needs the HBase token/credential, since the hook is executed under the RPC request user's context.
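
One way such a hook can obtain usable credentials, sketched below purely as an illustration (not necessarily what the attached 17435.v1.txt does, and assuming an HBase version that provides User.runAsLoginUser), is to perform the backup-table access as the region server's login user rather than the request user:
{code}
// Hypothetical sketch, not the attached patch: run the backup-system-table
// access under the region server's own login user, whose Kerberos credentials
// are always available, instead of the RPC request user's context.
import java.io.IOException;
import java.security.PrivilegedExceptionAction;
import java.util.List;

import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.security.User;

public class LoginUserHookSketch {

  // Placeholder for the kind of query BackupObserver issues in
  // preCommitStoreFile(); the real method names may differ.
  interface BackupQuery {
    List<TableName> tablesForIncrementalBackup() throws IOException;
  }

  List<TableName> listTablesForBackup(final BackupQuery query) throws IOException {
    // User.runAsLoginUser switches to the server's login user, so the SASL
    // handshake for any HBase RPC issued inside run() uses the server's ticket.
    return User.runAsLoginUser(new PrivilegedExceptionAction<List<TableName>>() {
      @Override
      public List<TableName> run() throws IOException {
        return query.tablesForIncrementalBackup();
      }
    });
  }
}
{code}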

  was:
[~romil.choksi] was testing bulk load in a secure cluster where LoadIncrementalHFiles failed.

Looking at the region server log, we saw the following:
{code}
        at org.apache.hadoop.hbase.client.ClientSmallReversedScanner.next(ClientSmallReversedScanner.java:185)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1257)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1163)
        at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:300)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
        at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:327)
        at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:302)
        at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:167)
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:162)
        at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
        at org.apache.hadoop.hbase.backup.impl.BackupSystemTable.getBackupContexts(BackupSystemTable.java:540)
        at org.apache.hadoop.hbase.backup.impl.BackupSystemTable.getBackupHistory(BackupSystemTable.java:517)
        at org.apache.hadoop.hbase.backup.impl.BackupSystemTable.getTablesForBackupType(BackupSystemTable.java:589)
        at org.apache.hadoop.hbase.backup.BackupObserver.preCommitStoreFile(BackupObserver.java:89)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$61.call(RegionCoprocessorHost.java:1494)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$RegionOperation.call(RegionCoprocessorHost.java:1660)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1734)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1692)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preCommitStoreFile(RegionCoprocessorHost.java:1490)
        at org.apache.hadoop.hbase.regionserver.HRegion.bulkLoadHFiles(HRegion.java:5512)
        at org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint$1.run(SecureBulkLoadEndpoint.java:293)
        at org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint$1.run(SecureBulkLoadEndpoint.java:276)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:356)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1704)
        at org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint.secureBulkLoadHFiles(SecureBulkLoadEndpoint.java:276)
        at org.apache.hadoop.hbase.protobuf.generated.SecureBulkLoadProtos$SecureBulkLoadService.callMethod(SecureBulkLoadProtos.java:4721)
...
Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:679)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:637)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:745)
        ... 16 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
        at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
{code}
The cause was that the security token for accessing HBase was missing in the coprocessor hook, leading to the SaslException above.


> Call to preCommitStoreFile() hook encounters SaslException in secure deployment
> -------------------------------------------------------------------------------
>
>                 Key: HBASE-17435
>                 URL: https://issues.apache.org/jira/browse/HBASE-17435
>             Project: HBase
>          Issue Type: Bug
>            Reporter: Romil Choksi
>            Assignee: Ted Yu
>             Fix For: 2.0.0, 1.4.0
>
>         Attachments: 17435.v1.txt
>
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
