From: cdouglas@apache.org
To: core-commits@hadoop.apache.org
Reply-To: core-dev@hadoop.apache.org
Subject: svn commit: r758495 - in /hadoop/core/trunk: ./ src/contrib/hdfsproxy/ src/contrib/hdfsproxy/conf/ src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/ src/contrib/hdfsproxy/src/test/re...
Date: Thu, 26 Mar 2009 01:42:01 -0000

Author: cdouglas
Date: Thu Mar 26 01:41:57 2009
New Revision: 758495

URL: http://svn.apache.org/viewvc?rev=758495&view=rev
Log:
HADOOP-5363. Add support for proxying connections to multiple clusters with
different versions to hdfsproxy. Contributed by Zhiyong Zhang
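In outline, the change keys routing off the hostname the client used to reach
hdfsproxy: hdfsproxy-site.xml maps each served hostname to the context path of
the war built for one cluster version, and a new ProxyForwardServlet dispatches
the request into that context. A condensed sketch of the idea (not the
committed code verbatim; the proxy-test -> /test mapping is the one added to
the test resources in this commit):

    package org.apache.hadoop.hdfsproxy.example; // hypothetical package for this sketch

    import java.io.IOException;
    import javax.servlet.ServletContext;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.apache.hadoop.conf.Configuration;

    public class ForwardSketch extends HttpServlet {
      @Override
      public void doGet(HttpServletRequest req, HttpServletResponse rsp)
          throws ServletException, IOException {
        // conf published by ProxyFilter.init(); maps hostname -> context path
        Configuration conf = (Configuration)
            getServletContext().getAttribute("org.apache.hadoop.hdfsproxy.conf");
        String version = conf.get(req.getServerName());   // e.g. "proxy-test" -> "/test"
        if (version == null) {                             // no mapping for this hostname
          rsp.sendError(HttpServletResponse.SC_NOT_IMPLEMENTED);
          return;
        }
        ServletContext dst = getServletContext().getContext(version); // cross-context lookup
        if (dst == null) {                                 // unknown or restricted context
          rsp.sendError(HttpServletResponse.SC_NOT_FOUND);
          return;
        }
        dst.getRequestDispatcher(req.getServletPath()).forward(req, rsp);
      }
    }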
Added:
    hadoop/core/trunk/src/contrib/hdfsproxy/conf/tomcat-forward-web.xml
    hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFileForward.java
    hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyForwardServlet.java
    hadoop/core/trunk/src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/SimpleServlet.java
    hadoop/core/trunk/src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/TestProxyForwardServlet.java
    hadoop/core/trunk/src/contrib/hdfsproxy/src/test/resources/proxy-config/hdfsproxy-site.xml

Modified:
    hadoop/core/trunk/CHANGES.txt
    hadoop/core/trunk/src/contrib/hdfsproxy/build.xml
    hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFileDataServlet.java
    hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFilter.java
    hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyListPathsServlet.java
    hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyStreamFile.java
    hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyUtil.java
    hadoop/core/trunk/src/contrib/hdfsproxy/src/test/resources/cactus-web.xml
    hadoop/core/trunk/src/hdfs/org/apache/hadoop/hdfs/HftpFileSystem.java
    hadoop/core/trunk/src/hdfs/org/apache/hadoop/hdfs/HsftpFileSystem.java

Modified: hadoop/core/trunk/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/CHANGES.txt?rev=758495&r1=758494&r2=758495&view=diff
==============================================================================
--- hadoop/core/trunk/CHANGES.txt (original)
+++ hadoop/core/trunk/CHANGES.txt Thu Mar 26 01:41:57 2009
@@ -63,6 +63,9 @@
     HADOOP-4539. Introduce backup node and checkpoint node. (shv)

+    HADOOP-5363. Add support for proxying connections to multiple clusters with
+    different versions to hdfsproxy. (Zhiyong Zhang via cdouglas)
+
   IMPROVEMENTS

     HADOOP-4565. Added CombineFileInputFormat to use data locality information
Modified: hadoop/core/trunk/src/contrib/hdfsproxy/build.xml
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/build.xml?rev=758495&r1=758494&r2=758495&view=diff
==============================================================================
[The Ant markup in this hunk was stripped in archiving and cannot be
recovered; the surviving text shows a new "Building the forward war file"
target added alongside the existing "Building the .war file" target, plus
entries for the new test servlet in the cactus test-war sections.]

Added: hadoop/core/trunk/src/contrib/hdfsproxy/conf/tomcat-forward-web.xml
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/conf/tomcat-forward-web.xml?rev=758495&view=auto
==============================================================================
[Tags in this and the other XML hunks below were stripped in archiving and
are reconstructed here from the surviving element values.]
--- hadoop/core/trunk/src/contrib/hdfsproxy/conf/tomcat-forward-web.xml (added)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/conf/tomcat-forward-web.xml Thu Mar 26 01:41:57 2009
@@ -0,0 +1,110 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<web-app>
+
+  <display-name>HDFS Proxy</display-name>
+  <description>get data from grid forward war</description>
+
+  <context-param>
+    <param-name>webmaster</param-name>
+    <param-value>zhiyong1@yahoo-inc.com</param-value>
+    <description>
+      The EMAIL address of the administrator to whom questions
+      and comments about this application should be addressed.
+    </description>
+  </context-param>
+
+  <filter>
+    <filter-name>proxyFilter</filter-name>
+    <filter-class>org.apache.hadoop.hdfsproxy.ProxyFilter</filter-class>
+    <init-param>
+      <param-name>filteraddress</param-name>
+      <param-value>10</param-value>
+    </init-param>
+  </filter>
+
+  <filter-mapping>
+    <filter-name>proxyFilter</filter-name>
+    <url-pattern>/*</url-pattern>
+  </filter-mapping>
+
+  <servlet>
+    <servlet-name>proxyForward</servlet-name>
+    <description>forward data access to specifc servlets</description>
+    <servlet-class>org.apache.hadoop.hdfsproxy.ProxyForwardServlet</servlet-class>
+  </servlet>
+
+  <servlet-mapping>
+    <servlet-name>proxyForward</servlet-name>
+    <url-pattern>/listPaths/*</url-pattern>
+  </servlet-mapping>
+  <servlet-mapping>
+    <servlet-name>proxyForward</servlet-name>
+    <url-pattern>/data/*</url-pattern>
+  </servlet-mapping>
+  <servlet-mapping>
+    <servlet-name>proxyForward</servlet-name>
+    <url-pattern>/streamFile/*</url-pattern>
+  </servlet-mapping>
+
+  <servlet>
+    <servlet-name>fileForward</servlet-name>
+    <description>forward file data access to streamFile</description>
+    <servlet-class>org.apache.hadoop.hdfsproxy.ProxyFileForward</servlet-class>
+  </servlet>
+
+  <servlet-mapping>
+    <servlet-name>fileForward</servlet-name>
+    <url-pattern>/file/*</url-pattern>
+  </servlet-mapping>
+
+  <welcome-file-list>
+    <welcome-file>index.html</welcome-file>
+  </welcome-file-list>
+
+  <session-config>
+    <session-timeout>30</session-timeout>
+  </session-config>
+
+</web-app>
Modified: hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFileDataServlet.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFileDataServlet.java?rev=758495&r1=758494&r2=758495&view=diff
==============================================================================
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFileDataServlet.java (original)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFileDataServlet.java Thu Mar 26 01:41:57 2009
@@ -21,8 +21,12 @@
 import java.net.URI;
 import java.net.URISyntaxException;

+import javax.servlet.ServletContext;
+import javax.servlet.ServletException;
 import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;

+import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.FileStatus;
 import org.apache.hadoop.hdfs.protocol.ClientProtocol;
 import org.apache.hadoop.hdfs.server.namenode.FileDataServlet;
@@ -32,6 +36,15 @@
 public class ProxyFileDataServlet extends FileDataServlet {
   /** For java.io.Serializable */
   private static final long serialVersionUID = 1L;
+
+  /** {@inheritDoc} */
+  @Override
+  public void init() throws ServletException {
+    ServletContext context = getServletContext();
+    if (context.getAttribute("name.conf") == null) {
+      context.setAttribute("name.conf", new Configuration());
+    }
+  }

   /** {@inheritDoc} */
   @Override
@@ -46,6 +59,8 @@
   /** {@inheritDoc} */
   @Override
   protected UnixUserGroupInformation getUGI(HttpServletRequest request) {
-    return (UnixUserGroupInformation) request.getAttribute("authorized.ugi");
+    String userID = (String) request.getAttribute("org.apache.hadoop.hdfsproxy.authorized.userID");
+    UnixUserGroupInformation ugi = ProxyUgiManager.getUgiForUser(userID);
+    return ugi;
   }
 }
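The same init()/getUGI() rewiring recurs verbatim in ProxyListPathsServlet and
ProxyStreamFile below. Condensed from the hunks, the handshake between the
filter and the servlets is now (fragments, not standalone code):

    // ProxyFilter.doFilter(), once the client is authorized:
    rqst.setAttribute("org.apache.hadoop.hdfsproxy.authorized.userID", userID);

    // each proxy servlet's getUGI():
    String userID = (String)
        request.getAttribute("org.apache.hadoop.hdfsproxy.authorized.userID");
    return ProxyUgiManager.getUgiForUser(userID);

Passing a plain String across the forward and letting the target servlet
rebuild the UGI through ProxyUgiManager, instead of sharing a
UnixUserGroupInformation object, is presumably what makes the hop between
separately deployed wars safe: each web application has its own classloader,
so a Hadoop object stored as a request attribute in one context cannot be cast
in another. (The commit log does not state this; it is inferred from the diff.)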
Added: hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFileForward.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFileForward.java?rev=758495&view=auto
==============================================================================
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFileForward.java (added)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFileForward.java Thu Mar 26 01:41:57 2009
@@ -0,0 +1,41 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.hdfsproxy;
+
+import javax.servlet.http.HttpServletRequest;
+
+import org.apache.hadoop.security.UnixUserGroupInformation;
+
+
+public class ProxyFileForward extends ProxyForwardServlet {
+  /** For java.io.Serializable */
+  private static final long serialVersionUID = 1L;
+
+  /** {@inheritDoc} */
+  @Override
+  protected String buildForwardPath(HttpServletRequest request, String pathInfo) {
+    String path = "/streamFile";
+    path += "?filename=" + request.getPathInfo();
+    UnixUserGroupInformation ugi =
+      (UnixUserGroupInformation) request.getAttribute("authorized.ugi");
+    if (ugi != null) {
+      path += "&ugi=" + ugi.toString();
+    }
+    return path;
+  }
+
+}
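For example (hostname, user, and path are hypothetical), a request routed
through the new /file/* mapping as

    https://proxy-host/file/user/alice/data.txt

arrives with getPathInfo() == "/user/alice/data.txt" and is rewritten for the
target context as

    /streamFile?filename=/user/alice/data.txt&ugi=alice,users

with the ugi parameter appended only when the filter has stored an
authorized.ugi attribute on the request.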
Modified: hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFilter.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFilter.java?rev=758495&r1=758494&r2=758495&view=diff
==============================================================================
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFilter.java (original)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyFilter.java Thu Mar 26 01:41:57 2009
@@ -61,14 +61,14 @@
       .compile("^(/clearUgiCache)$");
   /** Pattern for a filter to find out if a request is HFTP/HSFTP request */
   protected static final Pattern HFTP_PATTERN = Pattern
-      .compile("^(/listPaths|/data|/streamFile)$");
+      .compile("^(/listPaths|/data|/streamFile|/file)$");
   /**
    * Pattern for a filter to find out if an HFTP/HSFTP request stores its file
    * path in the extra path information associated with the URL; if not, the
    * file path is stored in request parameter "filename"
    */
   protected static final Pattern FILEPATH_PATTERN = Pattern
-      .compile("^(/listPaths|/data)$");
+      .compile("^(/listPaths|/data|/file)$");

   private static volatile Map<String, Set<Path>> permsMap;
   private static volatile Map<String, Set<BigInteger>> certsMap;
@@ -88,14 +88,16 @@
     Configuration conf = new Configuration(false);
     conf.addResource("hdfsproxy-default.xml");
     conf.addResource("ssl-server.xml");
+    conf.addResource("hdfsproxy-site.xml");
     String nn = conf.get("hdfsproxy.dfs.namenode.address");
     if (nn == null) {
       throw new ServletException("Proxy source cluster name node address not speficied");
     }
     InetSocketAddress nAddr = NetUtils.createSocketAddr(nn);
     context.setAttribute("name.node.address", nAddr);
-    context.setAttribute("name.conf", new Configuration());
-
+    context.setAttribute("name.conf", new Configuration());
+
+    context.setAttribute("org.apache.hadoop.hdfsproxy.conf", conf);
     LOG.info("proxyFilter initialization success: " + nn);
   }
@@ -165,7 +167,7 @@
     HttpServletRequest rqst = (HttpServletRequest) request;
     HttpServletResponse rsp = (HttpServletResponse) response;
-
+
     if (LOG.isDebugEnabled()) {
       StringBuilder b = new StringBuilder("Request from ").append(
           rqst.getRemoteHost()).append("/").append(rqst.getRemoteAddr())
@@ -264,7 +266,10 @@
       userID = userID.substring(3);

       String servletPath = rqst.getServletPath();
-      if (unitTest) servletPath = rqst.getParameter("TestSevletPathInfo");
+      if (unitTest) {
+        servletPath = rqst.getParameter("TestSevletPathInfo");
+        LOG.info("this is for unit test purpose only");
+      }

       if (HFTP_PATTERN.matcher(servletPath).matches()) {
         // request is an HSFTP request
@@ -317,11 +322,16 @@
         return;
       }
       rqst.setAttribute("authorized.ugi", ugi);
+      rqst.setAttribute("org.apache.hadoop.hdfsproxy.authorized.userID", userID);
     } else if(rqst.getScheme().equalsIgnoreCase("http")) { // http request, set ugi for servlets, only for testing purposes
       String ugi = rqst.getParameter("ugi");
       if (ugi != null) {
         rqst.setAttribute("authorized.ugi", new UnixUserGroupInformation(ugi
            .split(",")));
+        String[] ugiStr = ugi.split(",");
+        if(ugiStr.length > 0) {
+          rqst.setAttribute("org.apache.hadoop.hdfsproxy.authorized.userID", ugiStr[0]);
+        }
       }
     }
     chain.doFilter(request, response);
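The glue to note in this hunk: the filter now also loads hdfsproxy-site.xml,
which carries the hostname-to-version mappings, and publishes the merged
Configuration in the servlet context, where ProxyForwardServlet.init() (next
file) retrieves it. Condensed from the two diffs:

    // ProxyFilter.init()
    conf.addResource("hdfsproxy-site.xml");   // hostname -> context path entries
    context.setAttribute("org.apache.hadoop.hdfsproxy.conf", conf);

    // ProxyForwardServlet.init()
    configuration = (Configuration)
        getServletContext().getAttribute("org.apache.hadoop.hdfsproxy.conf");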
Added: hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyForwardServlet.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyForwardServlet.java?rev=758495&view=auto
==============================================================================
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyForwardServlet.java (added)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyForwardServlet.java Thu Mar 26 01:41:57 2009
@@ -0,0 +1,99 @@
+/** [Apache License 2.0 header, identical to ProxyFileForward.java above] */
+package org.apache.hadoop.hdfsproxy;
+
+import javax.servlet.http.HttpServlet;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+import java.io.IOException;
+import javax.servlet.ServletException;
+import javax.servlet.ServletContext;
+import javax.servlet.RequestDispatcher;
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.security.UnixUserGroupInformation;
+
+/**
+ * Forwards a request to the web application that serves the cluster
+ * version mapped to the requested hostname.
+ */
+public class ProxyForwardServlet extends HttpServlet {
+  /** For java.io.Serializable */
+  private static final long serialVersionUID = 1L;
+  private static Configuration configuration = null;
+  public static final Log LOG = LogFactory.getLog(ProxyForwardServlet.class);
+
+  /** {@inheritDoc} */
+  @Override
+  public void init() throws ServletException {
+    ServletContext context = getServletContext();
+    configuration = (Configuration) context.getAttribute("org.apache.hadoop.hdfsproxy.conf");
+  }
+
+  /** {@inheritDoc} */
+  @Override
+  public void doGet(HttpServletRequest request, HttpServletResponse response)
+    throws IOException, ServletException {
+    String hostname = request.getServerName();
+
+    String version = configuration.get(hostname);
+    if (version != null) {
+      ServletContext curContext = getServletContext();
+      ServletContext dstContext = curContext.getContext(version);
+
+      if (dstContext == null) {
+        LOG.info("Context non-exist or restricted from access: " + version);
+        response.sendError(HttpServletResponse.SC_NOT_FOUND);
+        return;
+      }
+      LOG.debug("Request to " + hostname + " is forwarded to version " + version);
+      forwardRequest(request, response, dstContext, request.getServletPath());
+
+    } else {
+      LOG.info("not a valid context path");
+      response.sendError(HttpServletResponse.SC_NOT_IMPLEMENTED);
+    }
+  }
+
+  /** {@inheritDoc} */
+  public void forwardRequest(HttpServletRequest request, HttpServletResponse response,
+      ServletContext context, String pathInfo) throws IOException, ServletException {
+    String path = buildForwardPath(request, pathInfo);
+    RequestDispatcher dispatcher = context.getRequestDispatcher(path);
+    if (dispatcher == null) {
+      LOG.info("There was no such dispatcher");
+      response.sendError(HttpServletResponse.SC_NO_CONTENT);
+      return;
+    }
+    dispatcher.forward(request, response);
+  }
+
+  /** {@inheritDoc} */
+  protected String buildForwardPath(HttpServletRequest request, String pathInfo) {
+    String path = pathInfo;
+    if (request.getPathInfo() != null) {
+      path += request.getPathInfo();
+    }
+    if (request.getQueryString() != null) {
+      path += "?" + request.getQueryString();
+    }
+    return path;
+  }
+}
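A concrete trace through the default path building (hostname and user are
illustrative; the proxy-test -> /test mapping is the one added to the test
hdfsproxy-site.xml below): GET /listPaths/user/alice?ugi=alice,users sent to
host proxy-test resolves version = configuration.get("proxy-test") = "/test",
and buildForwardPath() reassembles servlet path + extra path + query string,
so the request is re-dispatched as /listPaths/user/alice?ugi=alice,users
inside the /test context. One deployment caveat: ServletContext.getContext()
only returns a foreign context when the container permits cross-context
dispatch (in Tomcat, crossContext="true" on the Context); otherwise it
returns null and this servlet answers 404.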
Modified: hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyListPathsServlet.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyListPathsServlet.java?rev=758495&r1=758494&r2=758495&view=diff
==============================================================================
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyListPathsServlet.java (original)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyListPathsServlet.java Thu Mar 26 01:41:57 2009
@@ -17,8 +17,11 @@
  */
 package org.apache.hadoop.hdfsproxy;

+import javax.servlet.ServletContext;
+import javax.servlet.ServletException;
 import javax.servlet.http.HttpServletRequest;

+import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hdfs.server.namenode.ListPathsServlet;
 import org.apache.hadoop.security.UnixUserGroupInformation;

@@ -26,10 +29,21 @@
 public class ProxyListPathsServlet extends ListPathsServlet {
   /** For java.io.Serializable */
   private static final long serialVersionUID = 1L;
+
+  /** {@inheritDoc} */
+  @Override
+  public void init() throws ServletException {
+    ServletContext context = getServletContext();
+    if (context.getAttribute("name.conf") == null) {
+      context.setAttribute("name.conf", new Configuration());
+    }
+  }

   /** {@inheritDoc} */
   @Override
   protected UnixUserGroupInformation getUGI(HttpServletRequest request) {
-    return (UnixUserGroupInformation) request.getAttribute("authorized.ugi");
+    String userID = (String) request.getAttribute("org.apache.hadoop.hdfsproxy.authorized.userID");
+    UnixUserGroupInformation ugi = ProxyUgiManager.getUgiForUser(userID);
+    return ugi;
   }
 }

Modified: hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyStreamFile.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyStreamFile.java?rev=758495&r1=758494&r2=758495&view=diff
==============================================================================
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyStreamFile.java (original)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyStreamFile.java Thu Mar 26 01:41:57 2009
@@ -21,6 +21,7 @@
 import java.net.InetSocketAddress;

 import javax.servlet.ServletContext;
+import javax.servlet.ServletException;
 import javax.servlet.http.HttpServletRequest;

 import org.apache.hadoop.hdfs.DFSClient;
@@ -32,6 +33,14 @@
 public class ProxyStreamFile extends StreamFile {
   /** For java.io.Serializable */
   private static final long serialVersionUID = 1L;
+
+  /** {@inheritDoc} */
+  @Override
+  public void init() throws ServletException {
+    ServletContext context = getServletContext();
+    if (context.getAttribute("name.conf") == null) {
+      context.setAttribute("name.conf", new Configuration());
+    }
+  }

   /** {@inheritDoc} */
   @Override
@@ -50,6 +59,8 @@
   /** {@inheritDoc} */
   @Override
   protected UnixUserGroupInformation getUGI(HttpServletRequest request) {
-    return (UnixUserGroupInformation) request.getAttribute("authorized.ugi");
+    String userID = (String) request.getAttribute("org.apache.hadoop.hdfsproxy.authorized.userID");
+    UnixUserGroupInformation ugi = ProxyUgiManager.getUgiForUser(userID);
+    return ugi;
   }
 }
Modified: hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyUtil.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyUtil.java?rev=758495&r1=758494&r2=758495&view=diff
==============================================================================
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyUtil.java (original)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/java/org/apache/hadoop/hdfsproxy/ProxyUtil.java Thu Mar 26 01:41:57 2009
@@ -19,6 +19,8 @@
 package org.apache.hadoop.hdfsproxy;

 import java.io.IOException;
+import java.io.InputStream;
+import java.net.HttpURLConnection;
 import java.net.InetSocketAddress;
 import java.net.URI;
 import java.net.URISyntaxException;
@@ -35,6 +37,10 @@
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FSDataInputStream;
+import org.apache.hadoop.fs.FSInputStream;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.io.IOUtils;
 import org.apache.hadoop.net.NetUtils;
 import org.apache.hadoop.util.HostsFileReader;

@@ -46,7 +52,7 @@
   public static final Log LOG = LogFactory.getLog(ProxyUtil.class);

   private static enum UtilityOption {
-    RELOAD("-reloadPermFiles"), CLEAR("-clearUgiCache");
+    RELOAD("-reloadPermFiles"), CLEAR("-clearUgiCache"), GET("-get");

     private String name = null;
@@ -163,14 +169,48 @@
     }
     return true;
   }
+
+  static FSDataInputStream open(Configuration conf, String hostname, int port,
+      String path) throws IOException {
+    setupSslProps(conf);
+    HttpURLConnection connection = null;
+    connection = openConnection(hostname, port, path);
+    connection.connect();
+    final InputStream in = connection.getInputStream();
+    return new FSDataInputStream(new FSInputStream() {
+      public int read() throws IOException {
+        return in.read();
+      }
+      public int read(byte[] b, int off, int len) throws IOException {
+        return in.read(b, off, len);
+      }
+      public void close() throws IOException {
+        in.close();
+      }
+      public void seek(long pos) throws IOException {
+        throw new IOException("Can't seek!");
+      }
+      public long getPos() throws IOException {
+        throw new IOException("Position unknown!");
+      }
+      public boolean seekToNewSource(long targetPos) throws IOException {
+        return false;
+      }
+    });
+  }

   public static void main(String[] args) throws Exception {
-    if(args.length != 1 ||
+    if(args.length < 1 ||
         (!UtilityOption.RELOAD.getName().equalsIgnoreCase(args[0])
-        && !UtilityOption.CLEAR.getName().equalsIgnoreCase(args[0]))) {
+        && !UtilityOption.CLEAR.getName().equalsIgnoreCase(args[0])
+        && !UtilityOption.GET.getName().equalsIgnoreCase(args[0])) ||
+        (UtilityOption.GET.getName().equalsIgnoreCase(args[0]) && args.length != 4)) {
       System.err.println("Usage: ProxyUtil ["
           + UtilityOption.RELOAD.getName() + "] | ["
-          + UtilityOption.CLEAR.getName() + "]");
+          + UtilityOption.CLEAR.getName() + "] | ["
+          + UtilityOption.GET.getName() + " <hostname> <#port> <path> ]");
       System.exit(0);
     }
     Configuration conf = new Configuration(false);
@@ -179,10 +219,17 @@

     if (UtilityOption.RELOAD.getName().equalsIgnoreCase(args[0])) {
       // reload user-certs.xml and user-permissions.xml files
-      boolean error = sendCommand(conf, "/reloadPermFiles");
-    } else {
+      sendCommand(conf, "/reloadPermFiles");
+    } else if (UtilityOption.CLEAR.getName().equalsIgnoreCase(args[0])) {
       // clear UGI caches
-      boolean error = sendCommand(conf, "/clearUgiCache");
+      sendCommand(conf, "/clearUgiCache");
+    } else {
+      String hostname = args[1];
+      int port = Integer.parseInt(args[2]);
+      String path = args[3];
+      InputStream in = open(conf, hostname, port, path);
+      IOUtils.copyBytes(in, System.out, conf, false);
+      in.close();
     }
   }
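The new option turns ProxyUtil into a small test client for the proxy; an
invocation would look roughly like this (host, port, and path are
illustrative):

    hadoop org.apache.hadoop.hdfsproxy.ProxyUtil -get proxy-host 8443 /user/alice/data.txt

which sets up the SSL client properties, opens a connection through the proxy,
and copies the response body to stdout. The anonymous FSInputStream
deliberately throws on seek() and getPos(): the payload is a one-shot HTTP
response body, so only forward, sequential reads are possible.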
Added: hadoop/core/trunk/src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/SimpleServlet.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/SimpleServlet.java?rev=758495&view=auto
==============================================================================
[The HTML tag literals inside the print calls below, and in the test's
expected string, were stripped in archiving.]
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/SimpleServlet.java (added)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/SimpleServlet.java Thu Mar 26 01:41:57 2009
@@ -0,0 +1,51 @@
+/** [Apache License 2.0 header, identical to ProxyFileForward.java above] */
+
+package org.apache.hadoop.hdfsproxy;
+
+import javax.servlet.http.HttpServlet;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+
+import java.io.PrintWriter;
+import java.io.IOException;
+
+
+/**
+ * simple servlet for forward testing purpose
+ */
+public class SimpleServlet extends HttpServlet {
+
+  /** For java.io.Serializable */
+  private static final long serialVersionUID = 1L;
+
+  public void doGet(HttpServletRequest request, HttpServletResponse response)
+    throws IOException {
+    response.setContentType("text/html");
+    PrintWriter out = response.getWriter();
+    out.print("");
+    out.print("A GET request");
+    out.print("");
+    out.close();
+    return;
+  }
+
+}
Added: hadoop/core/trunk/src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/TestProxyForwardServlet.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/TestProxyForwardServlet.java?rev=758495&view=auto
==============================================================================
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/TestProxyForwardServlet.java (added)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/test/org/apache/hadoop/hdfsproxy/TestProxyForwardServlet.java Thu Mar 26 01:41:57 2009
@@ -0,0 +1,69 @@
+/** [Apache License 2.0 header, identical to ProxyFileForward.java above] */
+
+package org.apache.hadoop.hdfsproxy;
+
+import org.apache.cactus.ServletTestCase;
+import org.apache.cactus.WebRequest;
+import org.apache.cactus.WebResponse;
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
+
+import java.io.IOException;
+import javax.servlet.ServletException;
+
+/** Unit tests for ProxyForwardServlet */
+public class TestProxyForwardServlet extends ServletTestCase {
+  public static final Log LOG = LogFactory.getLog(TestProxyForwardServlet.class);
+
+
+  public void beginDoGet(WebRequest theRequest) {
+    theRequest.setURL("proxy-test:0", null, "/simple", null, null);
+  }
+
+  public void testDoGet() throws IOException, ServletException {
+    ProxyForwardServlet servlet = new ProxyForwardServlet();
+
+    servlet.init(config);
+    servlet.doGet(request, response);
+  }
+
+  public void endDoGet(WebResponse theResponse)
+    throws IOException {
+    String expected = "A GET request";
+    String result = theResponse.getText();
+
+    assertEquals(expected, result);
+  }
+
+
+  public void testForwardRequest() throws Exception {
+    ProxyForwardServlet servlet = new ProxyForwardServlet();
+
+    servlet.forwardRequest(request, response, config.getServletContext(), "/simple");
+  }
+
+  public void endForwardRequest(WebResponse theResponse) throws IOException {
+    String expected = "A GET request";
+    String result = theResponse.getText();
+
+    assertEquals(expected, result);
+  }
+
+}

Modified: hadoop/core/trunk/src/contrib/hdfsproxy/src/test/resources/cactus-web.xml
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/test/resources/cactus-web.xml?rev=758495&r1=758494&r2=758495&view=diff
==============================================================================
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/test/resources/cactus-web.xml (original)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/test/resources/cactus-web.xml Thu Mar 26 01:41:57 2009
@@ -57,10 +57,21 @@
       <param-value>value1 used for testing</param-value>
     </init-param>
   </servlet>
+
+  <servlet>
+    <servlet-name>Simple</servlet-name>
+    <description>A Simple Servlet</description>
+    <servlet-class>org.apache.hadoop.hdfsproxy.SimpleServlet</servlet-class>
+  </servlet>
+
   <servlet-mapping>
     <servlet-name>ServletRedirector_TestOverride</servlet-name>
     <url-pattern>/ServletRedirectorOverride</url-pattern>
   </servlet-mapping>
+
+  <servlet-mapping>
+    <servlet-name>Simple</servlet-name>
+    <url-pattern>/simple/*</url-pattern>
+  </servlet-mapping>

Added: hadoop/core/trunk/src/contrib/hdfsproxy/src/test/resources/proxy-config/hdfsproxy-site.xml
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/hdfsproxy/src/test/resources/proxy-config/hdfsproxy-site.xml?rev=758495&view=auto
==============================================================================
--- hadoop/core/trunk/src/contrib/hdfsproxy/src/test/resources/proxy-config/hdfsproxy-site.xml (added)
+++ hadoop/core/trunk/src/contrib/hdfsproxy/src/test/resources/proxy-config/hdfsproxy-site.xml Thu Mar 26 01:41:57 2009
@@ -0,0 +1,15 @@
+<?xml version="1.0"?>
+
+<configuration>
+
+  <property>
+    <name>proxy-test</name>
+    <value>/test</value>
+    <description>one hostname corresponds to one web application archive</description>
+  </property>
+
+</configuration>
Modified: hadoop/core/trunk/src/hdfs/org/apache/hadoop/hdfs/HftpFileSystem.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/hdfs/org/apache/hadoop/hdfs/HftpFileSystem.java?rev=758495&r1=758494&r2=758495&view=diff
==============================================================================
--- hadoop/core/trunk/src/hdfs/org/apache/hadoop/hdfs/HftpFileSystem.java (original)
+++ hadoop/core/trunk/src/hdfs/org/apache/hadoop/hdfs/HftpFileSystem.java Thu Mar 26 01:41:57 2009
@@ -91,26 +91,15 @@
     nnAddr = NetUtils.createSocketAddr(name.toString());
   }

-  /** randomly pick one from all available IP addresses of a given hostname */
-  protected String pickOneAddress(String hostname) throws UnknownHostException {
-    if ("localhost".equals(hostname))
-      return hostname;
-    InetAddress[] addrs = InetAddress.getAllByName(hostname);
-    if (addrs.length > 1)
-      return addrs[ran.nextInt(addrs.length)].getHostAddress();
-    return addrs[0].getHostAddress();
-  }

   @Override
   public URI getUri() {
     try {
-      return new URI("hftp", null, pickOneAddress(nnAddr.getHostName()), nnAddr.getPort(),
+      return new URI("hftp", null, nnAddr.getHostName(), nnAddr.getPort(),
                      null, null, null);
     } catch (URISyntaxException e) {
       return null;
-    } catch (UnknownHostException e) {
-      return null;
-    }
+    }
   }

   /**
@@ -121,7 +110,7 @@
   protected HttpURLConnection openConnection(String path, String query)
       throws IOException {
     try {
-      final URL url = new URI("http", null, pickOneAddress(nnAddr.getHostName()),
+      final URL url = new URI("http", null, nnAddr.getHostName(),
          nnAddr.getPort(), path, query, null).toURL();
       if (LOG.isTraceEnabled()) {
        LOG.trace("url=" + url);

Modified: hadoop/core/trunk/src/hdfs/org/apache/hadoop/hdfs/HsftpFileSystem.java
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/hdfs/org/apache/hadoop/hdfs/HsftpFileSystem.java?rev=758495&r1=758494&r2=758495&view=diff
==============================================================================
--- hadoop/core/trunk/src/hdfs/org/apache/hadoop/hdfs/HsftpFileSystem.java (original)
+++ hadoop/core/trunk/src/hdfs/org/apache/hadoop/hdfs/HsftpFileSystem.java Thu Mar 26 01:41:57 2009
@@ -69,7 +69,7 @@
   protected HttpURLConnection openConnection(String path, String query)
       throws IOException {
     try {
-      final URL url = new URI("https", null, pickOneAddress(nnAddr.getHostName()),
+      final URL url = new URI("https", null, nnAddr.getHostName(),
          nnAddr.getPort(), path, query, null).toURL();
       HttpsURLConnection conn = (HttpsURLConnection)url.openConnection();
       // bypass hostname verification
@@ -83,13 +83,11 @@
   @Override
   public URI getUri() {
     try {
-      return new URI("hsftp", null, pickOneAddress(nnAddr.getHostName()), nnAddr.getPort(),
+      return new URI("hsftp", null, nnAddr.getHostName(), nnAddr.getPort(),
                      null, null, null);
     } catch (URISyntaxException e) {
       return null;
-    } catch (UnknownHostException e) {
-      return null;
-    }
+    }
   }

  /**
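A note on the HftpFileSystem/HsftpFileSystem change: keeping the configured
hostname in the URI, rather than resolving it up front to a randomly chosen
IP address, is what lets a client reach the proxy under the hostname that
request.getServerName() later uses to pick the cluster version. (That reading
is inferred from the diff; the commit log does not spell it out.)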