hadoop-common-issues mailing list archives

From "Aaron Fabbri (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (HADOOP-14821) Executing the command 'hdfs -Dhadoop.security.credential.provider.path=file1.jceks,file2.jceks' fails if permission is denied to some files
Date Fri, 01 Sep 2017 18:13:00 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-14821?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16150952#comment-16150952 ]

Aaron Fabbri edited comment on HADOOP-14821 at 9/1/17 6:12 PM:
---------------------------------------------------------------

Edit: I see what you are saying: failure to access one file causes the whole lookup to fail even if subsequent files do allow access.

Also, I thought S3A support for credential providers landed around Hadoop 2.8 (HADOOP-12548)?
Not sure this will work in 2.7.x even when you get past the permission issue. Maybe it was backported?
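
A minimal sketch of why one unreadable keystore aborts the whole lookup (my reading of the behavior, not a quote of the Hadoop source): {{CredentialProviderFactory.getProviders()}} appears to open every keystore named in the provider path up front, so a permission error on the first .jceks surfaces before any later keystore is consulted. The paths are the ones from this issue; everything else is illustrative.

{code}
// Hedged illustration only (assumption about behavior, not Hadoop source):
// the factory seems to open every keystore named in the provider path, so a
// single unreadable .jceks throws before readable ones are ever reached.
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.alias.CredentialProvider;
import org.apache.hadoop.security.alias.CredentialProviderFactory;

public class ProviderPathProbe {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH,
        "jceks://hdfs/tmp/aws.jceks,jceks://hdfs/tmp/awst.jceks");
    // If /tmp/aws.jceks is admin:hdfs -rwx------ and we run as another user,
    // this call throws AccessControlException even when /tmp/awst.jceks is
    // readable, which matches the failure reported below.
    List<CredentialProvider> providers =
        CredentialProviderFactory.getProviders(conf);
    System.out.println("Opened " + providers.size() + " credential providers");
  }
}
{code}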


was (Author: fabbri):
It looks like a permissions issue on your .jceks file: it is not readable by the user you are running the hadoop command as, right?

Also, I thought S3A support for credential providers landed around Hadoop 2.8 (HADOOP-12548)?
Not sure this will work in 2.7.x even when you get past the permission issue.

> Executing the command 'hdfs -Dhadoop.security.credential.provider.path=file1.jceks,file2.jceks' fails if permission is denied to some files
> -------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-14821
>                 URL: https://issues.apache.org/jira/browse/HADOOP-14821
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: fs/s3, hdfs-client, security
>    Affects Versions: 2.7.3
>         Environment: hadoop-common-2.7.3.2.6.0.11-1
>            Reporter: Ernani Pereira de Mattos Junior
>            Priority: Critical
>              Labels: features
>
> ======= 
> Request Use Case: 
> UC1: 
> The customer has the path to a directory and subdirectories full of keys. The customer knows
> that he does not have access to all the keys but, ignoring this problem, he makes a list of the keys.
> UC1.2: 
> The customer, in FIFO order, tries each key from the list. If access to the key is granted
> locally, then he can try the login on s3a. 
> UC1.2: 
> The customer, in FIFO order, tries each key from the list. If access to the key is not granted
> locally, then he will skip the login on s3a and try the next key on the list. 
> ===========
> For now, UC1.2 fails with the exception below and does not try the next key:
> {code}
> $ hdfs --loglevel DEBUG dfs -Dhadoop.security.credential.provider.path=jceks://hdfs/tmp/aws.jceks,jceks://hdfs/tmp/awst.jceks -ls s3a://av-dl-hwx-nprod-anhffpoc-enriched/hive/e_ceod/
> Not retrying because try once and fail.
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=502549376, access=READ, inode="/tmp/aws.jceks":admin:hdfs:-rwx------
> {code}
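
As a rough sketch of the FIFO behavior UC1.2 above asks for (an illustration of the requested semantics, not existing Hadoop code): probe each keystore on its own, use the first one the current user can actually read, and skip the ones that raise a permission error. The alias name and the loop are assumptions; only the Hadoop classes and the {{hadoop.security.credential.provider.path}} key are real.

{code}
// Sketch of the requested "skip and try the next key" behavior (assumption,
// not the shipped implementation). Each provider URI is probed separately so
// one unreadable keystore cannot abort the others.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.alias.CredentialProviderFactory;

public class FifoCredentialLookup {
  public static char[] firstReadableSecret(Configuration base, String alias,
      String... providerUris) throws IOException {
    for (String uri : providerUris) {
      Configuration conf = new Configuration(base);
      conf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, uri);
      try {
        char[] secret = conf.getPassword(alias); // e.g. "fs.s3a.access.key"
        if (secret != null) {
          return secret;                         // readable keystore: use it
        }
      } catch (IOException permissionDenied) {
        // denied branch of UC1.2: skip this keystore and try the next one
      }
    }
    throw new IOException("No readable credential provider held " + alias);
  }
}
{code}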




