hadoop-hdfs-issues mailing list archives

From "Srikanth Sundarrajan (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HDFS-481) Bug Fixes + HdfsProxy to use proxy user to impersonate the real user
Date Tue, 06 Apr 2010 17:47:33 GMT

    [ https://issues.apache.org/jira/browse/HDFS-481?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12854080#action_12854080 ]

Srikanth Sundarrajan commented on HDFS-481:

Output from test-patch

     [exec] +1 overall.  
     [exec]     +1 @author.  The patch does not contain any @author tags.
     [exec]     +1 tests included.  The patch appears to include 15 new or modified tests.
     [exec]     +1 javadoc.  The javadoc tool did not generate any warning messages.
     [exec]     +1 javac.  The applied patch does not increase the total number of javac compiler warnings.
     [exec]     +1 findbugs.  The patch does not introduce any new Findbugs warnings.
     [exec]     +1 release audit.  The applied patch does not increase the total number of release audit warnings.



> Bug Fixes + HdfsProxy to use proxy user to impersonate the real user
> --------------------------------------------------------------------
>                 Key: HDFS-481
>                 URL: https://issues.apache.org/jira/browse/HDFS-481
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: contrib/hdfsproxy
>    Affects Versions: 0.21.0
>            Reporter: zhiyong zhang
>            Assignee: Srikanth Sundarrajan
>         Attachments: HDFS-481-bp-y20.patch, HDFS-481-bp-y20s.patch, HDFS-481.out, HDFS-481.patch,
> HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch,
>
> Bugs:
> 1. hadoop-version is not recognized if the ant command is run from src/contrib/ or from src/contrib/hdfsproxy.
> If ant is run from $HADOOP_HDFS_HOME, hadoop-version is passed to the contrib build through subant, but if it is run from src/contrib or src/contrib/hdfsproxy, hadoop-version is not recognized.
> 2. LdapIpDirFilter.java is not thread-safe: userName, group and paths are per-request values and cannot be class members (a per-request sketch follows the quoted description).
> 3. Addressed the following StackOverflowError:
> ERROR [org.apache.catalina.core.ContainerBase.[Catalina].[localhost].[/].[proxyForward]] Servlet.service() for servlet proxyForward threw exception
> java.lang.StackOverflowError
>         at org.apache.catalina.core.ApplicationHttpRequest.getAttribute(ApplicationHttpRequest.java:229)
> This happens when the target war (/target.war) does not exist: the forwarding war then forwards to its parent context path /, which is the forwarding war itself, causing an infinite loop. Added "HDFS Proxy Forward".equals(dstContext.getServletContextName()) to the if logic to break the loop (a sketch of the guard follows the quoted description).
> 4. Kerberos credentials of the remote user aren't available; HdfsProxy needs to act on behalf of the real user to service the requests (a proxy-user sketch follows the quoted description).
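
For item 2, a minimal sketch of the per-request pattern: keep only configuration that is set once in init() as a member field, and compute the user/group values as locals inside doFilter() so concurrent requests never share them. The class and helper names here are hypothetical and do not reproduce the actual LdapIpDirFilter code.

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;

    public class PerRequestStateFilter implements Filter {

      // Safe as a member: set once in init() and read-only afterwards.
      private String ldapBaseDn;

      public void init(FilterConfig conf) throws ServletException {
        ldapBaseDn = conf.getInitParameter("ldap.base.dn");
      }

      public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
          throws IOException, ServletException {
        // Per-request values stay on the stack; each concurrent request gets its own copy.
        String userName = lookupUser(req);     // hypothetical helper
        String group = lookupGroup(userName);  // hypothetical helper
        // ... path authorization checks against userName/group would go here ...
        chain.doFilter(req, resp);
      }

      public void destroy() {
      }

      private String lookupUser(ServletRequest req) {
        return req.getRemoteAddr(); // placeholder for an LDAP lookup keyed by client IP
      }

      private String lookupGroup(String user) {
        return "users"; // placeholder
      }
    }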
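
For item 3, a sketch of the loop guard on the forwarding path, assuming a servlet that resolves the destination context by path. The "/target" path, the servlet class and the error handling are illustrative; only the "HDFS Proxy Forward".equals(dstContext.getServletContextName()) check mirrors the condition quoted above.

    import java.io.IOException;
    import javax.servlet.RequestDispatcher;
    import javax.servlet.ServletContext;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ForwardSketchServlet extends HttpServlet {

      protected void doGet(HttpServletRequest req, HttpServletResponse resp)
          throws ServletException, IOException {
        // Resolve the destination webapp. If /target.war is not deployed, the container
        // can fall back to the root context, i.e. the forwarding webapp itself.
        ServletContext dstContext = getServletContext().getContext("/target");

        // Guard: if the lookup resolved back to the forwarding webapp (identified by its
        // display name), refuse to forward instead of recursing until StackOverflowError.
        if (dstContext == null
            || "HDFS Proxy Forward".equals(dstContext.getServletContextName())) {
          resp.sendError(HttpServletResponse.SC_NOT_FOUND, "target war not deployed");
          return;
        }

        String path = (req.getPathInfo() == null) ? "/" : req.getPathInfo();
        RequestDispatcher dispatcher = dstContext.getRequestDispatcher(path);
        dispatcher.forward(req, resp);
      }
    }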
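
For item 4, a minimal sketch of the proxy-user mechanism in Hadoop's UserGroupInformation API (createProxyUser plus doAs), which is the general pattern the description refers to; it is not the HdfsProxy change itself, and the user name and path are made up. The NameNode must additionally be configured to trust the proxy account (the hadoop.proxyuser.* settings).

    import java.security.PrivilegedExceptionAction;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class ProxyUserSketch {
      public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        final String realUser = (args.length > 0) ? args[0] : "webuser"; // end user being served

        // The proxy service authenticates with its own (e.g. Kerberos) credentials...
        UserGroupInformation proxyUgi = UserGroupInformation.getLoginUser();
        // ...and impersonates the real user for the actual filesystem access.
        UserGroupInformation ugi = UserGroupInformation.createProxyUser(realUser, proxyUgi);

        boolean exists = ugi.doAs(new PrivilegedExceptionAction<Boolean>() {
          public Boolean run() throws Exception {
            // Calls made here run as realUser, with proxyUgi recorded as the real caller.
            FileSystem fs = FileSystem.get(conf);
            return fs.exists(new Path("/user/" + realUser));
          }
        });
        System.out.println("/user/" + realUser + " exists: " + exists);
      }
    }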

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
