hadoop-hdfs-issues mailing list archives

From "Srikanth Sundarrajan (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HDFS-481) Bug Fixes + HdfsProxy to use proxy user to impersonate the real user
Date Mon, 29 Mar 2010 10:44:27 GMT

    [ https://issues.apache.org/jira/browse/HDFS-481?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12850878#action_12850878 ]

Srikanth Sundarrajan commented on HDFS-481:
-------------------------------------------

{quote}
If it is not too hard, it would be great if you could separate the divide patch to some other
issues like HDFS-1009. In general, a JIRA should only fix one issue
{quote}

Without these fixes, the proxy will not be fully functional and compatible with HDFS (Kerberos-based setup). Hence I have moved the fixes for HDFS-1009 into this JIRA. Related changes are:

1. Inclusion of KerberosAuthorizationFilter which extends AuthorizationFilter
2. web.xml to include KerberosAuthorizationFilter instead of the default AuthorizationFilter
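The web.xml change in item 2 would look roughly like the following fragment. Only the two class names come from this issue; the package, filter-name, and url-pattern are illustrative assumptions, not taken from the actual patch:

```xml
<!-- Sketch of the web.xml swap: register KerberosAuthorizationFilter
     in place of the default AuthorizationFilter. The package and
     url-pattern here are assumptions. -->
<filter>
  <filter-name>authorizationFilter</filter-name>
  <filter-class>org.apache.hadoop.hdfsproxy.KerberosAuthorizationFilter</filter-class>
</filter>
<filter-mapping>
  <filter-name>authorizationFilter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
```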

Have listed the changed files in the patch along with a brief summary of what the change is
meant for.

> Bug Fixes + HdfsProxy to use proxy user to impersonate the real user
> --------------------------------------------------------------------
>
>                 Key: HDFS-481
>                 URL: https://issues.apache.org/jira/browse/HDFS-481
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: contrib/hdfsproxy
>    Affects Versions: 0.21.0
>            Reporter: zhiyong zhang
>            Assignee: Srikanth Sundarrajan
>         Attachments: HDFS-481-bp-y20.patch, HDFS-481-bp-y20s.patch, HDFS-481.out, HDFS-481.patch,
HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch
>
>
> Bugs:
> 1. hadoop-version is not recognized if the ant command is run from src/contrib/ or from src/contrib/hdfsproxy.
> If ant is run from $HADOOP_HDFS_HOME, hadoop-version is passed to the contrib builds through subant. But if it is run from src/contrib or src/contrib/hdfsproxy, hadoop-version is not recognized.
> 2. LdapIpDirFilter.java is not thread-safe. userName, group, and paths are per-request values and cannot be class members.
> 3. Addressed the following StackOverflowError:
> ERROR [org.apache.catalina.core.ContainerBase.[Catalina].[localhost].[/].[proxyForward]] Servlet.service() for servlet proxyForward threw exception
> java.lang.StackOverflowError
>         at org.apache.catalina.core.ApplicationHttpRequest.getAttribute(ApplicationHttpRequest.java:229)
>      This occurs when the target war (/target.war) does not exist: the forwarding war then forwards to its parent context path /, which is the forwarding war itself, causing an infinite loop. Added "HDFS Proxy Forward".equals(dstContext.getServletContextName()) to the if logic to break the loop.
> 4. Kerberos credentials of the remote user aren't available. HdfsProxy needs to act on behalf of the real user to service the requests.
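Bug 2 above is the classic servlet-filter concurrency mistake: the container creates one filter instance to serve all requests, so per-request values stored in instance fields can be overwritten by a concurrent request. A minimal self-contained sketch of the before/after pattern (class and field names are illustrative, not taken from LdapIpDirFilter):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class FilterThreadSafety {

    // Broken pattern (before the fix): one instance serves all requests,
    // so an instance field holding per-request data is shared mutable
    // state and can be clobbered by a concurrent request.
    static class UnsafeFilter {
        private String userName;            // shared across threads
        String handle(String user) {
            userName = user;                // race: may be overwritten
            return "path/" + userName;
        }
    }

    // Fixed pattern: per-request data lives in local variables, which
    // are confined to each thread's own stack frame.
    static class SafeFilter {
        String handle(String user) {
            String userName = user;         // local, thread-confined
            return "path/" + userName;
        }
    }

    public static void main(String[] args) throws Exception {
        SafeFilter filter = new SafeFilter();
        ExecutorService pool = Executors.newFixedThreadPool(2);
        Future<String> a = pool.submit(() -> filter.handle("alice"));
        Future<String> b = pool.submit(() -> filter.handle("bob"));
        // Each request sees only its own user, regardless of interleaving.
        System.out.println(a.get() + " " + b.get());
        // prints "path/alice path/bob"
        pool.shutdown();
    }
}
```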
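Bug 4 is what Hadoop's proxy-user (impersonation) mechanism addresses: the proxy authenticates with its own Kerberos credentials and acts on behalf of the real user, subject to limits configured for the superuser in core-site.xml. A typical fragment might look as follows; the proxy principal's short name (hdfsproxy) and the host/group values are hypothetical examples, not taken from this issue:

```xml
<!-- core-site.xml on the NameNode: allow the proxy's own principal
     (short name "hdfsproxy" -- hypothetical) to impersonate users.
     Host and group values are examples only. -->
<property>
  <name>hadoop.proxyuser.hdfsproxy.hosts</name>
  <value>proxyhost1,proxyhost2</value>
</property>
<property>
  <name>hadoop.proxyuser.hdfsproxy.groups</name>
  <value>users</value>
</property>
```

On the code side this corresponds to Hadoop's UserGroupInformation.createProxyUser(...) API, which wraps the real user's identity around the proxy's authenticated credentials.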

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

