hadoop-hdfs-issues mailing list archives

From "Srikanth Sundarrajan (JIRA)" <j...@apache.org>
Subject [jira] Updated: (HDFS-481) Bug Fixes
Date Fri, 26 Mar 2010 06:34:29 GMT

     [ https://issues.apache.org/jira/browse/HDFS-481?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Srikanth Sundarrajan updated HDFS-481:
--------------------------------------

    Attachment: HDFS-481.out

The patch already includes changes to build.xml to pull a newer Tomcat version (needed to run the LdapIpDirFilter tests):

{noformat} 

@@ -299,7 +301,7 @@
      <containerset>
        <cargo containerId="${tomcat.container.id}" timeout="30000" output="${logs.dir}/output.log" log="${logs.dir}/cargo.log">
         <zipUrlInstaller
-            installUrl="http://apache.osuosl.org/tomcat/tomcat-6/v6.0.18/bin/apache-tomcat-6.0.18.zip"
+            installUrl="http://apache.osuosl.org/tomcat/tomcat-6/v6.0.24/bin/apache-tomcat-6.0.24.zip"
             installDir="${target.dir}/${tomcat.container.id}"/>
          <configuration type="existing" home="${tomcatconfig.dir}">
            <property name="cargo.servlet.port" value="${cargo.servlet.http.port}"/>

{noformat}

All the contrib tests (including LdapIpDirFilter and AuthorizationFilter) seem to run successfully with the revised patch (HDFS-481.patch). Logs from the test-patch and test-contrib runs are attached.

Nicholas, I will exclude the whitespace changes from the patch and re-attach it for review.

> Bug Fixes
> ---------
>
>                 Key: HDFS-481
>                 URL: https://issues.apache.org/jira/browse/HDFS-481
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: contrib/hdfsproxy
>    Affects Versions: 0.21.0
>            Reporter: zhiyong zhang
>            Assignee: Srikanth Sundarrajan
>         Attachments: HDFS-481-bp-y20.patch, HDFS-481-bp-y20s.patch, HDFS-481.out, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch
>
>
> 1. hadoop-version is not recognized if the ant command is run from src/contrib/ or from src/contrib/hdfsproxy.
> If the ant command is run from $HADOOP_HDFS_HOME, hadoop-version is passed to contrib's build through subant. But if it is run from src/contrib or src/contrib/hdfsproxy, hadoop-version is not recognized.
> 2. The ssl.client.do.not.authenticate.server setting can currently only be set through HDFS's configuration files; it needs to move to ssl-client.xml (see the first sketch after this description).
> 3. Solve some race conditions in LdapIpDirFilter.java (userId, groupName, and paths need to be moved into doFilter() as locals instead of being class members; see the filter sketch below).
> 4. Addressed the following StackOverflowError (loop guard sketched below):
> ERROR [org.apache.catalina.core.ContainerBase.[Catalina].[localhost].[/].[proxyForward]] Servlet.service() for servlet proxyForward threw exception
> java.lang.StackOverflowError
>         at org.apache.catalina.core.ApplicationHttpRequest.getAttribute(ApplicationHttpRequest.java:229)
> This occurs because when the target war (/target.war) does not exist, the forwarding war forwards to its parent context path /, which is the forwarding war itself, causing an infinite loop. Added a "HDFS Proxy Forward".equals(dstContext.getServletContextName()) check to the if logic to break the loop.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

