Subject: svn commit: r1460410 - in /hadoop/common/branches/HDFS-2802: ./ hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/ hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/mapred/ hadoop-tools/hadoop-distcp/src/main/java/org/apac...
Date: Sun, 24 Mar 2013 15:56:21 -0000
To: common-commits@hadoop.apache.org
From: suresh@apache.org
Message-Id: <20130324155621.C6AB02388978@eris.apache.org>

Author: suresh
Date: Sun Mar 24 15:56:18 2013
New Revision: 1460410

URL: http://svn.apache.org/r1460410
Log:
Merging trunk to branch HDFS-2802

Modified:
    hadoop/common/branches/HDFS-2802/   (props changed)
    hadoop/common/branches/HDFS-2802/BUILDING.txt
    hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/CopyListing.java
    hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/DistCp.java
    hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/DistCpConstants.java
    hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/SimpleCopyListing.java
    hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/mapred/RetriableFileCopyCommand.java
    hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/util/ThrottledInputStream.java
    hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestCopyListing.java
    hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java

Propchange: hadoop/common/branches/HDFS-2802/
------------------------------------------------------------------------------
  Merged /hadoop/common/trunk:r1457713-1460408

Modified: hadoop/common/branches/HDFS-2802/BUILDING.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/BUILDING.txt?rev=1460410&r1=1460409&r2=1460410&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/BUILDING.txt (original)
+++ hadoop/common/branches/HDFS-2802/BUILDING.txt Sun Mar 24 15:56:18 2013
@@ -107,7 +107,7 @@ When you import the project to eclipse,
 $ cd hadoop-maven-plugins
 $ mvn install
 
-Then, generate ecplise project files.
+Then, generate eclipse project files.
 
 $ mvn eclipse:eclipse -DskipTests
 
@@ -147,10 +147,10 @@ Requirements:
 * Windows System
 * JDK 1.6
 * Maven 3.0
-* Findbugs 1.3.9 (if running findbugs)
+* Windows SDK or Visual Studio 2010 Professional
 * ProtocolBuffer 2.4.1+ (for MapReduce and HDFS)
+* Findbugs 1.3.9 (if running findbugs)
 * Unix command-line tools from GnuWin32 or Cygwin: sh, mkdir, rm, cp, tar, gzip
-* Windows SDK or Visual Studio 2010 Professional
 * Internet connection for first build (to fetch all Maven and Hadoop dependencies)
 
 If using Visual Studio, it must be Visual Studio 2010 Professional (not 2012).
@@ -185,23 +185,13 @@ set Platform=Win32 (when building on a 3
 Several tests require that the user must have the Create Symbolic Links privilege.
 
-All Maven goals are the same as described above, with the addition of profile
--Pnative-win to trigger building Windows native components. The native
-components are required (not optional) on Windows. For example:
-
- * Run tests : mvn -Pnative-win test
+All Maven goals are the same as described above with the exception that
+native code is built by enabling the 'native-win' Maven profile. -Pnative-win
+is enabled by default when building on Windows since the native components
+are required (not optional) on Windows.
 ----------------------------------------------------------------------------------
 Building distributions:
 
-Create binary distribution with native code and with documentation:
-
-  $ mvn package -Pdist,native-win,docs -DskipTests -Dtar
-
-Create source distribution:
-
-  $ mvn package -Pnative-win,src -DskipTests
-
-Create source and binary distributions with native code and documentation:
+ * Build distribution with native code : mvn package [-Pdist][-Pdocs][-Psrc][-Dtar]
-
-  $ mvn package -Pdist,native-win,docs,src -DskipTests -Dtar

Modified: hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/CopyListing.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/CopyListing.java?rev=1460410&r1=1460409&r2=1460410&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/CopyListing.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/CopyListing.java Sun Mar 24 15:56:18 2013
@@ -30,6 +30,7 @@ import org.apache.hadoop.tools.util.Dist
 import org.apache.hadoop.security.Credentials;
 
 import java.io.IOException;
+import java.lang.reflect.Constructor;
 
 /**
  * The CopyListing abstraction is responsible for how the list of
@@ -193,14 +194,34 @@ public abstract class CopyListing extend
    * @param credentials Credentials object on which the FS delegation tokens are cached
    * @param options The input Options, to help choose the appropriate CopyListing Implementation.
    * @return An instance of the appropriate CopyListing implementation.
+   * @throws java.io.IOException - Exception if any
    */
   public static CopyListing getCopyListing(Configuration configuration,
                                            Credentials credentials,
-                                           DistCpOptions options) {
-    if (options.getSourceFileListing() == null) {
-      return new GlobbedCopyListing(configuration, credentials);
-    } else {
-      return new FileBasedCopyListing(configuration, credentials);
+                                           DistCpOptions options)
+      throws IOException {
+
+    String copyListingClassName = configuration.get(DistCpConstants.
+        CONF_LABEL_COPY_LISTING_CLASS, "");
+    Class<? extends CopyListing> copyListingClass;
+    try {
+      if (! copyListingClassName.isEmpty()) {
+        copyListingClass = configuration.getClass(DistCpConstants.
+            CONF_LABEL_COPY_LISTING_CLASS, GlobbedCopyListing.class,
+            CopyListing.class);
+      } else {
+        if (options.getSourceFileListing() == null) {
+          copyListingClass = GlobbedCopyListing.class;
+        } else {
+          copyListingClass = FileBasedCopyListing.class;
+        }
+      }
+      copyListingClassName = copyListingClass.getName();
+      Constructor<? extends CopyListing> constructor = copyListingClass.
+          getDeclaredConstructor(Configuration.class, Credentials.class);
+      return constructor.newInstance(configuration, credentials);
+    } catch (Exception e) {
+      throw new IOException("Unable to instantiate " + copyListingClassName, e);
     }
   }

Modified: hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/DistCp.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/DistCp.java?rev=1460410&r1=1460409&r2=1460410&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/DistCp.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/DistCp.java Sun Mar 24 15:56:18 2013
@@ -319,7 +319,7 @@ public class DistCp extends Configured i
    * @return Returns the path where the copy listing is created
    * @throws IOException - If any
    */
-  private Path createInputFileListing(Job job) throws IOException {
+  protected Path createInputFileListing(Job job) throws IOException {
     Path fileListingPath = getFileListingPath();
     CopyListing copyListing = CopyListing.getCopyListing(job.getConfiguration(),
         job.getCredentials(), inputOptions);
@@ -334,7 +334,7 @@ public class DistCp extends Configured i
    * @return - Path where the copy listing file has to be saved
    * @throws IOException - Exception if any
    */
-  private Path getFileListingPath() throws IOException {
+  protected Path getFileListingPath() throws IOException {
     String fileListPathStr = metaFolder + "/fileList.seq";
     Path path = new Path(fileListPathStr);
     return new Path(path.toUri().normalize().toString());

Modified: hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/DistCpConstants.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/DistCpConstants.java?rev=1460410&r1=1460409&r2=1460410&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/DistCpConstants.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/DistCpConstants.java Sun Mar 24 15:56:18 2013
@@ -82,6 +82,9 @@ public class DistCpConstants {
   /* Meta folder where the job's intermediate data is kept */
   public static final String CONF_LABEL_META_FOLDER = "distcp.meta.folder";
 
+  /* DistCp CopyListing class override param */
+  public static final String CONF_LABEL_COPY_LISTING_CLASS = "distcp.copy.listing.class";
+
   /**
    * Conf label for SSL Trust-store location.
    */

Modified: hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/SimpleCopyListing.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/SimpleCopyListing.java?rev=1460410&r1=1460409&r2=1460410&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/SimpleCopyListing.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/SimpleCopyListing.java Sun Mar 24 15:56:18 2013
@@ -127,17 +127,20 @@ public class SimpleCopyListing extends C
           if (LOG.isDebugEnabled()) {
             LOG.debug("Recording source-path: " + sourceStatus.getPath() + " for copy.");
           }
-          writeToFileListing(fileListWriter, sourceStatus, sourcePathRoot, localFile);
+          writeToFileListing(fileListWriter, sourceStatus, sourcePathRoot,
+              localFile, options);
 
           if (isDirectoryAndNotEmpty(sourceFS, sourceStatus)) {
             if (LOG.isDebugEnabled()) {
               LOG.debug("Traversing non-empty source dir: " + sourceStatus.getPath());
             }
-            traverseNonEmptyDirectory(fileListWriter, sourceStatus, sourcePathRoot, localFile);
+            traverseNonEmptyDirectory(fileListWriter, sourceStatus, sourcePathRoot,
+                localFile, options);
           }
         }
       } else {
-        writeToFileListing(fileListWriter, rootStatus, sourcePathRoot, localFile);
+        writeToFileListing(fileListWriter, rootStatus, sourcePathRoot,
+            localFile, options);
       }
     } finally {
@@ -169,6 +172,17 @@ public class SimpleCopyListing extends C
     }
   }
 
+  /**
+   * Provide an option to skip copy of a path, Allows for exclusion
+   * of files such as {@link org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter#SUCCEEDED_FILE_NAME}
+   * @param path - Path being considered for copy while building the file listing
+   * @param options - Input options passed during DistCp invocation
+   * @return - True if the path should be considered for copy, false otherwise
+   */
+  protected boolean shouldCopy(Path path, DistCpOptions options) {
+    return true;
+  }
+
   /** {@inheritDoc} */
   @Override
   protected long getBytesToCopy() {
@@ -210,7 +224,9 @@ public class SimpleCopyListing extends C
 
   private void traverseNonEmptyDirectory(SequenceFile.Writer fileListWriter,
                                          FileStatus sourceStatus,
-                                         Path sourcePathRoot, boolean localFile)
+                                         Path sourcePathRoot,
+                                         boolean localFile,
+                                         DistCpOptions options)
                                          throws IOException {
     FileSystem sourceFS = sourcePathRoot.getFileSystem(getConf());
     Stack<FileStatus> pathStack = new Stack<FileStatus>();
@@ -221,7 +237,8 @@ public class SimpleCopyListing extends C
         if (LOG.isDebugEnabled())
           LOG.debug("Recording source-path: " + sourceStatus.getPath() + " for copy.");
-        writeToFileListing(fileListWriter, child, sourcePathRoot, localFile);
+        writeToFileListing(fileListWriter, child, sourcePathRoot,
+            localFile, options);
         if (isDirectoryAndNotEmpty(sourceFS, child)) {
           if (LOG.isDebugEnabled())
             LOG.debug("Traversing non-empty source dir: "
@@ -233,8 +250,10 @@ public class SimpleCopyListing extends C
   }
 
   private void writeToFileListing(SequenceFile.Writer fileListWriter,
-                                  FileStatus fileStatus, Path sourcePathRoot,
-                                  boolean localFile) throws IOException {
+                                  FileStatus fileStatus,
+                                  Path sourcePathRoot,
+                                  boolean localFile,
+                                  DistCpOptions options) throws IOException {
     if (fileStatus.getPath().equals(sourcePathRoot) && fileStatus.isDirectory())
       return; // Skip the root-paths.
@@ -248,6 +267,10 @@ public class SimpleCopyListing extends C
       status = getFileStatus(fileStatus);
     }
 
+    if (!shouldCopy(fileStatus.getPath(), options)) {
+      return;
+    }
+
     fileListWriter.append(new Text(DistCpUtils.getRelativePath(sourcePathRoot,
         fileStatus.getPath())), status);
     fileListWriter.sync();

Modified: hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/mapred/RetriableFileCopyCommand.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/mapred/RetriableFileCopyCommand.java?rev=1460410&r1=1460409&r2=1460410&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/mapred/RetriableFileCopyCommand.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/mapred/RetriableFileCopyCommand.java Sun Mar 24 15:56:18 2013
@@ -124,7 +124,7 @@ public class RetriableFileCopyCommand ex
         tmpTargetPath, true, BUFFER_SIZE,
         getReplicationFactor(fileAttributes, sourceFileStatus, targetFS, tmpTargetPath),
         getBlockSize(fileAttributes, sourceFileStatus, targetFS, tmpTargetPath), context));
-    return copyBytes(sourceFileStatus, outStream, BUFFER_SIZE, true, context);
+    return copyBytes(sourceFileStatus, outStream, BUFFER_SIZE, context);
   }
 
   private void compareFileLengths(FileStatus sourceFileStatus, Path target,
@@ -170,8 +170,8 @@ public class RetriableFileCopyCommand ex
   }
 
   private long copyBytes(FileStatus sourceFileStatus, OutputStream outStream,
-                         int bufferSize, boolean mustCloseStream,
-                         Mapper.Context context) throws IOException {
+                         int bufferSize, Mapper.Context context)
+      throws IOException {
     Path source = sourceFileStatus.getPath();
     byte buf[] = new byte[bufferSize];
     ThrottledInputStream inStream = null;
@@ -187,8 +187,7 @@ public class RetriableFileCopyCommand ex
         bytesRead = inStream.read(buf);
       }
     } finally {
-      if (mustCloseStream)
-        IOUtils.cleanup(LOG, outStream, inStream);
+      IOUtils.cleanup(LOG, outStream, inStream);
     }
     return totalBytesRead;

Modified: hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/util/ThrottledInputStream.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/util/ThrottledInputStream.java?rev=1460410&r1=1460409&r2=1460410&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/util/ThrottledInputStream.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/main/java/org/apache/hadoop/tools/util/ThrottledInputStream.java Sun Mar 24 15:56:18 2013
@@ -52,6 +52,11 @@ public class ThrottledInputStream extend
     this.maxBytesPerSec = maxBytesPerSec;
   }
 
+  @Override
+  public void close() throws IOException {
+    rawStream.close();
+  }
+
   /** @inheritDoc */
   @Override
   public int read() throws IOException {

Modified: hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestCopyListing.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestCopyListing.java?rev=1460410&r1=1460409&r2=1460410&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestCopyListing.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestCopyListing.java Sun Mar 24 15:56:18 2013
@@ -24,6 +24,7 @@ import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.fs.FileSystem;
 import org.apache.hadoop.fs.FileStatus;
 import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter;
 import org.apache.hadoop.tools.util.TestDistCpUtils;
 import org.apache.hadoop.hdfs.MiniDFSCluster;
 import org.apache.hadoop.security.Credentials;
@@ -79,7 +80,39 @@ public class TestCopyListing extends Sim
     return 0;
   }
 
-  @Test
+  @Test(timeout=10000)
+  public void testSkipCopy() throws Exception {
+    SimpleCopyListing listing = new SimpleCopyListing(getConf(), CREDENTIALS) {
+      @Override
+      protected boolean shouldCopy(Path path, DistCpOptions options) {
+        return !path.getName().equals(FileOutputCommitter.SUCCEEDED_FILE_NAME);
+      }
+    };
+    FileSystem fs = FileSystem.get(getConf());
+    List<Path> srcPaths = new ArrayList<Path>();
+    srcPaths.add(new Path("/tmp/in4/1"));
+    srcPaths.add(new Path("/tmp/in4/2"));
+    Path target = new Path("/tmp/out4/1");
+    TestDistCpUtils.createFile(fs, "/tmp/in4/1/_SUCCESS");
+    TestDistCpUtils.createFile(fs, "/tmp/in4/1/file");
+    TestDistCpUtils.createFile(fs, "/tmp/in4/2");
+    fs.mkdirs(target);
+    DistCpOptions options = new DistCpOptions(srcPaths, target);
+    Path listingFile = new Path("/tmp/list4");
+    listing.buildListing(listingFile, options);
+    Assert.assertEquals(listing.getNumberOfPaths(), 2);
+    SequenceFile.Reader reader = new SequenceFile.Reader(getConf(),
+        SequenceFile.Reader.file(listingFile));
+    FileStatus fileStatus = new FileStatus();
+    Text relativePath = new Text();
+    Assert.assertTrue(reader.next(relativePath, fileStatus));
+    Assert.assertEquals(relativePath.toString(), "/1/file");
+    Assert.assertTrue(reader.next(relativePath, fileStatus));
+    Assert.assertEquals(relativePath.toString(), "/2");
+    Assert.assertFalse(reader.next(relativePath, fileStatus));
+  }
+
+  @Test(timeout=10000)
   public void testMultipleSrcToFile() {
     FileSystem fs = null;
     try {
@@ -124,7 +157,7 @@ public class TestCopyListing extends Sim
     }
   }
 
-  @Test
+  @Test(timeout=10000)
   public void testDuplicates() {
     FileSystem fs = null;
     try {
@@ -150,7 +183,7 @@ public class TestCopyListing extends Sim
     }
   }
 
-  @Test
+  @Test(timeout=10000)
   public void testBuildListing() {
     FileSystem fs = null;
     try {
@@ -206,7 +239,7 @@ public class TestCopyListing extends Sim
     }
   }
 
-  @Test
+  @Test(timeout=10000)
   public void testBuildListingForSingleFile() {
     FileSystem fs = null;
     String testRootString = "/singleFileListing";

Modified: hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java?rev=1460410&r1=1460409&r2=1460410&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-tools/hadoop-distcp/src/test/java/org/apache/hadoop/tools/TestIntegration.java Sun Mar 24 15:56:18 2013
@@ -26,6 +26,7 @@ import org.apache.hadoop.fs.FileSystem;
 import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.mapreduce.Cluster;
 import org.apache.hadoop.mapreduce.JobSubmissionFiles;
+import org.apache.hadoop.security.Credentials;
 import org.apache.hadoop.tools.util.TestDistCpUtils;
 import org.junit.Assert;
 import org.junit.BeforeClass;
@@ -34,6 +35,7 @@ import org.junit.Test;
 import java.io.IOException;
 import java.io.OutputStream;
 import java.util.ArrayList;
+import java.util.Arrays;
 import java.util.List;
 
 public class TestIntegration {
@@ -68,7 +70,7 @@ public class TestIntegration {
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testSingleFileMissingTarget() {
     caseSingleFileMissingTarget(false);
     caseSingleFileMissingTarget(true);
@@ -91,7 +93,7 @@ public class TestIntegration {
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testSingleFileTargetFile() {
     caseSingleFileTargetFile(false);
     caseSingleFileTargetFile(true);
@@ -101,7 +103,7 @@
     try {
       addEntries(listFile, "singlefile1/file1");
-      createFiles("singlefile1/file1", target.toString());
+      createFiles("singlefile1/file1", "target");
 
       runTest(listFile, target, sync);
 
@@ -114,7 +116,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testSingleFileTargetDir() {
     caseSingleFileTargetDir(false);
     caseSingleFileTargetDir(true);
@@ -138,7 +140,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testSingleDirTargetMissing() {
     caseSingleDirTargetMissing(false);
     caseSingleDirTargetMissing(true);
@@ -161,7 +163,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testSingleDirTargetPresent() {
 
     try {
@@ -180,7 +182,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testUpdateSingleDirTargetPresent() {
 
    try {
@@ -199,7 +201,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testMultiFileTargetPresent() {
     caseMultiFileTargetPresent(false);
     caseMultiFileTargetPresent(true);
@@ -223,7 +225,56 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
+  public void testCustomCopyListing() {
+
+    try {
+      addEntries(listFile, "multifile1/file3", "multifile1/file4", "multifile1/file5");
+      createFiles("multifile1/file3", "multifile1/file4", "multifile1/file5");
+      mkdirs(target.toString());
+
+      Configuration conf = getConf();
+      try {
+        conf.setClass(DistCpConstants.CONF_LABEL_COPY_LISTING_CLASS,
+            CustomCopyListing.class, CopyListing.class);
+        DistCpOptions options = new DistCpOptions(Arrays.
+            asList(new Path(root + "/" + "multifile1")), target);
+        options.setSyncFolder(true);
+        options.setDeleteMissing(false);
+        options.setOverwrite(false);
+        try {
+          new DistCp(conf, options).execute();
+        } catch (Exception e) {
+          LOG.error("Exception encountered ", e);
+          throw new IOException(e);
+        }
+      } finally {
+        conf.unset(DistCpConstants.CONF_LABEL_COPY_LISTING_CLASS);
+      }
+
+      checkResult(target, 2, "file4", "file5");
+    } catch (IOException e) {
+      LOG.error("Exception encountered while testing distcp", e);
+      Assert.fail("distcp failure");
+    } finally {
+      TestDistCpUtils.delete(fs, root);
+    }
+  }
+
+  private static class CustomCopyListing extends SimpleCopyListing {
+
+    public CustomCopyListing(Configuration configuration,
+                             Credentials credentials) {
+      super(configuration, credentials);
+    }
+
+    @Override
+    protected boolean shouldCopy(Path path, DistCpOptions options) {
+      return !path.getName().equals("file3");
+    }
+  }
+
+  @Test(timeout=100000)
   public void testMultiFileTargetMissing() {
     caseMultiFileTargetMissing(false);
     caseMultiFileTargetMissing(true);
@@ -246,7 +297,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testMultiDirTargetPresent() {
 
     try {
@@ -265,7 +316,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testUpdateMultiDirTargetPresent() {
 
     try {
@@ -284,7 +335,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testMultiDirTargetMissing() {
 
     try {
@@ -304,7 +355,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testUpdateMultiDirTargetMissing() {
 
     try {
@@ -323,7 +374,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testDeleteMissingInDestination() {
 
     try {
@@ -343,7 +394,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testOverwrite() {
     byte[] contents1 = "contents1".getBytes();
     byte[] contents2 = "contents2".getBytes();
@@ -375,7 +426,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testGlobTargetMissingSingleLevel() {
 
     try {
@@ -398,7 +449,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testUpdateGlobTargetMissingSingleLevel() {
 
     try {
@@ -420,7 +471,7 @@
     }
  }
 
-  @Test
+  @Test(timeout=100000)
   public void testGlobTargetMissingMultiLevel() {
 
     try {
@@ -444,7 +495,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testUpdateGlobTargetMissingMultiLevel() {
 
     try {
@@ -468,7 +519,7 @@
     }
   }
 
-  @Test
+  @Test(timeout=100000)
   public void testCleanup() {
     try {
       Path sourcePath = new Path("noscheme:///file");
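
The CopyListing.getCopyListing() change in this commit makes the listing implementation pluggable: a class name is read from configuration (distcp.copy.listing.class), resolved, and instantiated reflectively through a constructor with a fixed (Configuration, Credentials) signature, with any reflection failure wrapped in an IOException. The same pattern can be sketched with only the JDK; the Listing interface, the class names, and the "listing.class" key below are invented for illustration and are not part of the Hadoop API:

```java
import java.lang.reflect.Constructor;
import java.util.Map;

public class PluggableListingDemo {

    // Hypothetical stand-ins for DistCp's CopyListing hierarchy.
    public interface Listing { String name(); }

    public static class DefaultListing implements Listing {
        public DefaultListing(Map<String, String> conf) { }
        @Override public String name() { return "default"; }
    }

    public static class CustomListing implements Listing {
        public CustomListing(Map<String, String> conf) { }
        @Override public String name() { return "custom"; }
    }

    // Mirrors the shape of getCopyListing(): resolve a class name from
    // configuration (falling back to a default), then instantiate it
    // reflectively through a known constructor signature.
    public static Listing getListing(Map<String, String> conf) {
        String className = conf.getOrDefault("listing.class",
                DefaultListing.class.getName());
        try {
            Class<? extends Listing> cls =
                    Class.forName(className).asSubclass(Listing.class);
            Constructor<? extends Listing> ctor =
                    cls.getDeclaredConstructor(Map.class);
            return ctor.newInstance(conf);
        } catch (ReflectiveOperationException e) {
            // Same wrap-and-rethrow strategy as the patch, which throws
            // new IOException("Unable to instantiate " + className, e).
            throw new IllegalStateException("Unable to instantiate " + className, e);
        }
    }

    public static void main(String[] args) {
        // Default path: no override configured.
        System.out.println(getListing(Map.of()).name());
        // Override path: plug in an implementation by class name, the way
        // conf.setClass(CONF_LABEL_COPY_LISTING_CLASS, ...) does in the test.
        System.out.println(getListing(
                Map.of("listing.class", CustomListing.class.getName())).name());
    }
}
```

The fixed constructor signature is the contract that makes the override safe: any subclass that fails to provide it is rejected at instantiation time with a wrapped exception rather than a silent fallback, which is why the patch also resolves copyListingClassName before the reflective call so the error message names the offending class.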