hadoop-common-issues mailing list archives

From "Yuanbo Liu (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (HADOOP-13119) Web UI error accessing links which need authorization when Kerberos
Date Fri, 04 Nov 2016 03:59:58 GMT

[ https://issues.apache.org/jira/browse/HADOOP-13119?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15635132#comment-15635132 ]

Yuanbo Liu edited comment on HADOOP-13119 at 11/4/16 3:59 AM:
--------------------------------------------------------------

[~jeffreyr97]/[~eyang]
I've read through the implementation of {{HttpServer2.java}} and some of the filters; here is my investigation result.
!screenshot-1.png!
From the picture, we can see that access to {{/logs}} is also controlled by the SPNEGO filter (the "authentication" filter in the filter chain is a SPNEGO filter).
{{HttpServer2#initSpnego}} is confusing: the method does not work, and it is not how the SPNEGO filter actually gets added. The right way to enable SPNEGO is to set these properties:
{code}
hadoop.http.authentication.simple.anonymous.allowed    false
hadoop.http.authentication.signature.secret.file       /etc/security/http_secret
hadoop.http.authentication.type         kerberos
hadoop.http.authentication.kerberos.keytab      /etc/security/keytabs/spnego.service.keytab
hadoop.http.authentication.kerberos.principal   HTTP/_HOST@EXAMPLE.COM
hadoop.http.filter.initializers org.apache.hadoop.security.AuthenticationFilterInitializer
hadoop.http.authentication.cookie.domain       EXAMPLE.COM
{code}
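For reference, these are the same settings written as {{core-site.xml}} entries (values taken from the list above; adjust the realm, host, and file paths for your cluster):
{code}
<property>
  <name>hadoop.http.filter.initializers</name>
  <value>org.apache.hadoop.security.AuthenticationFilterInitializer</value>
</property>
<property>
  <name>hadoop.http.authentication.type</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.http.authentication.kerberos.principal</name>
  <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hadoop.http.authentication.kerberos.keytab</name>
  <value>/etc/security/keytabs/spnego.service.keytab</value>
</property>
<property>
  <name>hadoop.http.authentication.signature.secret.file</name>
  <value>/etc/security/http_secret</value>
</property>
<property>
  <name>hadoop.http.authentication.simple.anonymous.allowed</name>
  <value>false</value>
</property>
<property>
  <name>hadoop.http.authentication.cookie.domain</name>
  <value>EXAMPLE.COM</value>
</property>
{code}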
The SPNEGO filter is added by the method {{HttpServer2#addFilter}}.
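To illustrate (a simplified sketch, not the actual Hadoop source; the class name {{SketchAuthFilterInitializer}} is hypothetical): the initializer configured above receives the {{FilterContainer}}, which {{HttpServer2}} implements, and wires in the filter roughly like this:
{code}
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.http.FilterContainer;
import org.apache.hadoop.http.FilterInitializer;
import org.apache.hadoop.security.authentication.server.AuthenticationFilter;

// Simplified sketch of how a filter initializer wires the SPNEGO filter into HttpServer2.
// The real AuthenticationFilterInitializer also copies every
// "hadoop.http.authentication.*" property and reads the signature secret file.
public class SketchAuthFilterInitializer extends FilterInitializer {
  @Override
  public void initFilter(FilterContainer container, Configuration conf) {
    Map<String, String> params = new HashMap<>();
    params.put("type", conf.get("hadoop.http.authentication.type", "simple"));
    params.put("kerberos.principal",
        conf.get("hadoop.http.authentication.kerberos.principal"));
    params.put("kerberos.keytab",
        conf.get("hadoop.http.authentication.kerberos.keytab"));
    // HttpServer2 implements FilterContainer, so this call ends up in
    // HttpServer2#addFilter, which registers the filter on the web contexts.
    container.addFilter("authentication",
        AuthenticationFilter.class.getName(), params);
  }
}
{code}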

[~jeffreyr97] The reason you cannot access {{/logs}} is that, by default, {{/logs}} requires not only authentication but also authorization, and authorization is controlled by the property *dfs.cluster.administrators*. The user knox succeeds at authentication but fails authorization. Having to add the user knox to *dfs.cluster.administrators* is expected behavior, because this configuration controls who can access the default servlets.
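For example, something along these lines in {{hdfs-site.xml}} would let knox through authorization (the group name here is just a placeholder; the value follows the usual ACL format of comma-separated users, a space, then comma-separated groups):
{code}
<property>
  <name>dfs.cluster.administrators</name>
  <!-- users first, then a space, then groups -->
  <value>hdfs,knox hadoop</value>
</property>
{code}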
On the other hand, I like the idea of making the SPNEGO filter support proxy users. Proxy user is a basic feature of Hadoop, and the SPNEGO filter should support it. By the way, I need to apologize for mixing up the concepts of proxy user and delegation filter in the internal discussion; they're quite different.

In conclusion, I propose:
* Removing {{HttpServer2#initSpnego}}. The code is dead and misleading.
* Extending {{org.apache.hadoop.security.AuthenticationFilter}} so that the SPNEGO filter supports proxy users by default (see the sketch after this list).
* Deleting the redundant NoCacheFilter (see the picture) from the WebAppContext, and adding NoCacheFilter to the LogContext's filter chain.
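
For the second item, here is a rough sketch of what I have in mind, assuming the proxy-user handling would mirror the {{doAs}} query parameter used elsewhere in Hadoop (the class name and the details below are hypothetical, not an actual patch):
{code}
import java.io.IOException;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.security.authentication.server.AuthenticationFilter;
import org.apache.hadoop.security.authorize.AuthorizationException;
import org.apache.hadoop.security.authorize.ProxyUsers;

// Hypothetical subclass: after SPNEGO authentication succeeds, honor a "doAs"
// query parameter by checking the hadoop.proxyuser.* rules.
public class ProxyUserAuthenticationFilter extends AuthenticationFilter {

  @Override
  protected void doFilter(FilterChain filterChain, HttpServletRequest request,
      HttpServletResponse response) throws IOException, ServletException {
    String doAs = request.getParameter("doAs");
    String realUser = request.getRemoteUser();   // set by AuthenticationFilter
    if (doAs != null && realUser != null && !doAs.equals(realUser)) {
      UserGroupInformation realUgi = UserGroupInformation.createRemoteUser(realUser);
      UserGroupInformation proxyUgi = UserGroupInformation.createProxyUser(doAs, realUgi);
      try {
        // Enforce hadoop.proxyuser.<realUser>.hosts / .users / .groups
        ProxyUsers.authorize(proxyUgi, request.getRemoteAddr());
      } catch (AuthorizationException ex) {
        response.sendError(HttpServletResponse.SC_FORBIDDEN, ex.getMessage());
        return;
      }
    }
    super.doFilter(filterChain, request, response);
  }
}
{code}
A real change would also need to wrap the request so that downstream servlets see the proxied user, and to load the {{hadoop.proxyuser.*}} rules into {{ProxyUsers}} when the filter is initialized.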

[~zjshen]/[~atm]/[~daryn]/[~vinodkv], I'm tagging you here since you have contributed a lot to the security filters in Hadoop.
If you or anyone on the watch list has thoughts about this JIRA, please let me know.
Thanks in advance.


> Web UI error accessing links which need authorization when Kerberos
> -------------------------------------------------------------------
>
>                 Key: HADOOP-13119
>                 URL: https://issues.apache.org/jira/browse/HADOOP-13119
>             Project: Hadoop Common
>          Issue Type: Bug
>    Affects Versions: 2.8.0, 2.7.4
>            Reporter: Jeffrey E  Rodriguez
>            Assignee: Yuanbo Liu
>              Labels: security
>         Attachments: screenshot-1.png
>
>
> Run Hadoop in secure mode.
> Log in as a KDC user and kinit.
> Start Firefox with Kerberos (SPNEGO) negotiation enabled.
> Access http://localhost:50070/logs/
> Get 403 authorization errors.
> Only the hdfs user could access the logs.
> Would expect, as a user, to be able to use the web interface's logs link.
> Same results when using curl:
> curl -v --negotiate -u tester: http://localhost:50070/logs/
> HTTP/1.1 403 User tester is unauthorized to access this page.
> So:
> 1. Either don't show the links if only the hdfs user is able to access them.
> 2. Provide a mechanism to add users to the web application realm.
> 3. Note that we pass authentication, so the issue is authorization to /logs/.
> Suspect that the /logs/ path is secured in the web descriptor, so users by default don't
> have access to secured paths.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


