hadoop-hdfs-issues mailing list archives

From "Srikanth Sundarrajan (JIRA)" <j...@apache.org>
Subject [jira] Updated: (HDFS-481) Bug Fixes + HdfsProxy to use proxy user to impersonate the real user
Date Mon, 29 Mar 2010 10:36:27 GMT

     [ https://issues.apache.org/jira/browse/HDFS-481?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Srikanth Sundarrajan updated HDFS-481:
--------------------------------------

    Description: 
Bugs:

1. hadoop-version is not recognized if the ant command is run from src/contrib/ or from
src/contrib/hdfsproxy.

When ant is run from $HADOOP_HDFS_HOME, hadoop-version is passed to the contrib builds
through subant, but when run from src/contrib or src/contrib/hdfsproxy it is not set.

2. LdapIpDirFilter.java is not thread-safe. userName, group, and paths are per-request
values and cannot be class members (see the sketch below).
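
A minimal sketch of the thread-safety pattern; the class body, helper, and attribute name
are placeholders, not the actual patch (the real change introduces an LdapEntry holder, see
the summary of changes below):

    import java.io.IOException;
    import javax.servlet.*;

    // One Filter instance serves all requests concurrently, so per-request
    // values must be locals inside doFilter(), never instance fields.
    public class ThreadSafeFilterSketch implements Filter {
        // NOT thread safe (the original bug):
        // private String userName; private String group;

        public void init(FilterConfig config) {}
        public void destroy() {}

        public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
                throws IOException, ServletException {
            // Thread safe: each request gets its own copies on the stack.
            String userName = lookupUser(req.getRemoteAddr());  // hypothetical LDAP lookup
            req.setAttribute("authorized.user", userName);      // placeholder attribute name
            chain.doFilter(req, resp);
        }

        private String lookupUser(String ip) { return "someuser"; } // stub
    }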

3. Addressed the following StackOverflowError (see the sketch below).
ERROR [org.apache.catalina.core.ContainerBase.[Catalina].[localhost].[/].[proxyForward]]
Servlet.service() for servlet proxyForward threw exception
java.lang.StackOverflowError
        at org.apache.catalina.core.ApplicationHttpRequest.getAttribute(ApplicationHttpRequest.java:229)
     This happens when the target war (/target.war) does not exist: the forwarding war
forwards to its parent context path /, which is the forwarding war itself, causing an
infinite loop. Added "HDFS Proxy Forward".equals(dstContext.getServletContextName()) to the
if logic to break the loop.
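
A minimal sketch of the guard, assuming a hypothetical servlet class and target context
path; only the getServletContextName() comparison reflects the actual change:

    import java.io.IOException;
    import javax.servlet.*;
    import javax.servlet.http.*;

    public class ForwardGuardSketch extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // getContext() falls back to the parent context "/" when the target
            // war is not deployed, which resolves to this forwarding war itself.
            ServletContext dstContext = getServletContext().getContext("/target"); // assumed path
            if (dstContext == null
                    || "HDFS Proxy Forward".equals(dstContext.getServletContextName())) {
                // Forwarding here would recurse into ourselves until StackOverflowError.
                resp.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR,
                        "HDFS Proxy Forward: target war is not deployed");
                return;
            }
            dstContext.getRequestDispatcher(req.getServletPath()).forward(req, resp);
        }
    }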

4. Kerberos credentials of the remote user aren't available; HdfsProxy needs to act on
behalf of the real user to service requests (see the sketch below).
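
A minimal sketch of the impersonation pattern, assuming remoteUser was already
authenticated by the filters; the class and method names are illustrative, and the
NameNode must separately be configured to let the proxy principal impersonate:

    import java.security.PrivilegedExceptionAction;
    import org.apache.hadoop.security.UserGroupInformation;

    public class ImpersonationSketch {
        static void serveAs(String remoteUser) throws Exception {
            // The proxy's own Kerberos identity, obtained at login time.
            UserGroupInformation proxyUgi = UserGroupInformation.getLoginUser();
            // Wrap the real user's name around the proxy's credentials.
            UserGroupInformation ugi =
                UserGroupInformation.createProxyUser(remoteUser, proxyUgi);
            ugi.doAs(new PrivilegedExceptionAction<Void>() {
                public Void run() throws Exception {
                    // HDFS calls made here carry the real user's identity.
                    return null;
                }
            });
        }
    }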



  was:
1. hadoop-version is not recognized if the ant command is run from src/contrib/ or from
src/contrib/hdfsproxy.

When ant is run from $HADOOP_HDFS_HOME, hadoop-version is passed to the contrib builds
through subant, but when run from src/contrib or src/contrib/hdfsproxy it is not set.

2. The ssl.client.do.not.authenticate.server setting can only be set through HDFS's
configuration files; this setting needs to move to ssl-client.xml.

3. Solve some race conditions in LdapIpDirFilter.java (userId, groupName, and paths need
to be moved into doFilter() instead of being class members).

4. Addressed the following StackOverflowError.
ERROR [org.apache.catalina.core.ContainerBase.[Catalina].[localhost].[/].[proxyForward]]
Servlet.service() for servlet proxyForward threw exception
java.lang.StackOverflowError
        at org.apache.catalina.core.ApplicationHttpRequest.getAttribute(ApplicationHttpRequest.java:229)
     This happens when the target war (/target.war) does not exist: the forwarding war
forwards to its parent context path /, which is the forwarding war itself, causing an
infinite loop. Added "HDFS Proxy Forward".equals(dstContext.getServletContextName()) to the
if logic to break the loop.




        Summary: Bug Fixes + HdfsProxy to use proxy user to impersonate the real user  (was:
Bug Fixes)

Summary of Changes:

1. ProxyFileDataServlet, ProxyListPathsServlet, ProxyFileForward - Use createProxyUser instead
of createRemoteUser to obtain the UGI for the requesting user; the name.conf context attribute
is set by LdapIpDirFilter

2. LdapIpDirFilter - Removed class members userId, groupName, and paths; these are now set
per request through LdapEntry (a private inner class)

3. KerberosAuthorizationFilter - Accesses the proxy user's keytab file for credentials and
initializes the UGI (see the sketch below)
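
A minimal sketch of the keytab login, assuming hypothetical configuration keys (the actual
attribute names live in hdfsproxy-default.xml, see change 7):

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KeytabLoginSketch {
        static void login(Configuration conf) throws IOException {
            UserGroupInformation.setConfiguration(conf);
            // Both configuration keys below are assumptions for illustration.
            UserGroupInformation.loginUserFromKeytab(
                    conf.get("hdfsproxy.kerberos.principal"),
                    conf.get("hdfsproxy.kerberos.keytab.file"));
        }
    }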

4. LdapIpDirFilter + AuthorizationFilter - Separated IP-based authentication and path
authorization into two independent filters: IP-based authentication is done by LdapIpDirFilter
and path authorization is implemented through AuthorizationFilter (see the sketch below)
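
A minimal sketch of the split, with placeholder class, attribute, and helper names; each
filter does exactly one job and hands off through a request attribute:

    import java.io.IOException;
    import javax.servlet.*;
    import javax.servlet.http.*;

    // Filter 1: authenticate by source IP and publish the user for later filters.
    class IpAuthenticationSketch implements Filter {
        public void init(FilterConfig c) {}
        public void destroy() {}
        public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
                throws IOException, ServletException {
            String user = "mapped-user"; // stand-in for the LDAP IP-to-user lookup
            req.setAttribute("authorized.user", user);
            chain.doFilter(req, resp);
        }
    }

    // Filter 2: authorize the requested path for the already-authenticated user.
    class PathAuthorizationSketch implements Filter {
        public void init(FilterConfig c) {}
        public void destroy() {}
        public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
                throws IOException, ServletException {
            String user = (String) req.getAttribute("authorized.user");
            String path = ((HttpServletRequest) req).getPathInfo();
            if (user == null || !isAllowed(user, path)) { // hypothetical policy check
                ((HttpServletResponse) resp).sendError(HttpServletResponse.SC_FORBIDDEN);
                return;
            }
            chain.doFilter(req, resp);
        }
        private boolean isAllowed(String user, String path) { return true; } // stub
    }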

5. TestLdapIpDirFilter + TestAuthorizationFilter - IP-based test cases are retained in
TestLdapIpDirFilter and path test cases are moved to TestAuthorizationFilter

6. ProxyUtil - Added methods for creating the proxy user and getting the namenode URL from
the Hadoop configuration (see the sketch below)
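
For the namenode lookup, a one-method sketch; FileSystem.getDefaultUri reads the filesystem
URI (fs.default.name) from the configuration:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class NamenodeUriSketch {
        static URI namenodeUri(Configuration conf) {
            // Resolves e.g. hdfs://namenode:8020 from fs.default.name.
            return FileSystem.getDefaultUri(conf);
        }
    }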

7. hdfsproxy-default.xml - Includes new security-related attributes

8. tomcat-web.xml - Adds an additional filter for authorization; allows LdapIpDirFilter &
KerberosAuthorizationFilter to be processed for the forward and request dispatcher methods

9. build.xml - Includes TestAuthorizationFilter in the Cactus-based unit tests; also increases
the log verbosity during the build

10. ProxyForwardServlet - Fix for the infinite loop by verifying whether the destination
context is the same as the current one and aborting if so

11. TestProxyUtil & TestHdfsProxy - Fixes to get the tests to run



> Bug Fixes + HdfsProxy to use proxy user to impersonate the real user
> --------------------------------------------------------------------
>
>                 Key: HDFS-481
>                 URL: https://issues.apache.org/jira/browse/HDFS-481
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: contrib/hdfsproxy
>    Affects Versions: 0.21.0
>            Reporter: zhiyong zhang
>            Assignee: Srikanth Sundarrajan
>         Attachments: HDFS-481-bp-y20.patch, HDFS-481-bp-y20s.patch, HDFS-481.out, HDFS-481.patch,
HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch
>
>
> Bugs:
> 1. hadoop-version is not recognized if the ant command is run from src/contrib/ or from
> src/contrib/hdfsproxy. When ant is run from $HADOOP_HDFS_HOME, hadoop-version is passed
> to the contrib builds through subant, but when run from src/contrib or
> src/contrib/hdfsproxy it is not set.
> 2. LdapIpDirFilter.java is not thread-safe. userName, group, and paths are per-request
> values and cannot be class members.
> 3. Addressed the following StackOverflowError. 
> ERROR [org.apache.catalina.core.ContainerBase.[Catalina].[localhost].[/].[proxyForward]]
> Servlet.service() for servlet proxyForward threw exception
> java.lang.StackOverflowError
>         at org.apache.catalina.core.ApplicationHttpRequest.getAttribute(ApplicationHttpRequest.java:229)
>      This happens when the target war (/target.war) does not exist: the forwarding war
> forwards to its parent context path /, which is the forwarding war itself, causing an
> infinite loop. Added "HDFS Proxy Forward".equals(dstContext.getServletContextName()) to
> the if logic to break the loop.
> 4. Kerberos credentials of the remote user aren't available; HdfsProxy needs to act on
> behalf of the real user to service requests.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

