hadoop-common-issues mailing list archives

From "ASF GitHub Bot (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-14971) Merge S3A committers into trunk
Date Fri, 17 Nov 2017 23:10:00 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-14971?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16257727#comment-16257727 ]

ASF GitHub Bot commented on HADOOP-14971:
-----------------------------------------

Github user ajfabbri commented on a diff in the pull request:

    https://github.com/apache/hadoop/pull/282#discussion_r151810215
  
    --- Diff: hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/commit/AbstractITCommitProtocol.java
---
    @@ -1257,6 +1293,74 @@ public void testAMWorkflow() throws Throwable {
         assertNoMultipartUploadsPending(getOutDir());
       }
     
    +
    +  @Test
    +  public void testParallelJobsToAdjacentPaths() throws Throwable {
    +
    +    describe("Run two jobs in parallel, assert they both complete");
    +    JobData jobData = startJob(true);
    +    Job job1 = jobData.job;
    +    AbstractS3ACommitter committer1 = jobData.committer;
    +    JobContext jContext1 = jobData.jContext;
    +    TaskAttemptContext tContext1 = jobData.tContext;
    +
    +    // now build up a second job
    +    String jobId2 = randomJobId();
    +    String attempt20 = "attempt_" + jobId2 + "_m_000000_0";
    +    TaskAttemptID taskAttempt20 = TaskAttemptID.forName(attempt20);
    +    String attempt21 = "attempt_" + jobId2 + "_m_000001_0";
    +    TaskAttemptID taskAttempt21 = TaskAttemptID.forName(attempt21);
    +
    +    Path job1Dest = outDir;
    +    Path job2Dest = new Path(getOutDir().getParent(),
    +        getMethodName() + "job2Dest");
    +    // little safety check
    +    assertNotEquals(job1Dest, job2Dest);
    +
    +    // create the second job
    +    Job job2 = newJob(job2Dest, new JobConf(getConfiguration()), attempt20);
    +    Configuration conf2 = job2.getConfiguration();
    +    conf2.setInt(MRJobConfig.APPLICATION_ATTEMPT_ID, 1);
    +    try {
    +      JobContext jContext2 = new JobContextImpl(conf2,
    +          taskAttempt20.getJobID());
    +      TaskAttemptContext tContext2 =
    +          new TaskAttemptContextImpl(conf2, taskAttempt20);
    +      AbstractS3ACommitter committer2 = createCommitter(job2Dest, tContext2);
    +      JobData jobData2 = new JobData(job2, jContext2, tContext2, committer2);
    +      setup(jobData2);
    +      abortInTeardown(jobData2);
    +      // make sure committer2 is bound to the second job's destination
    +      assertEquals(job2Dest, committer2.getOutputPath());
    +
    +      // job2 setup, write some data there
    +      writeTextOutput(tContext2);
    +
    +      // at this point, job1 and job2 both have uncommitted tasks
    +
    +      // commit tasks in order task 2, task 1.
    +      committer2.commitTask(tContext2);
    +      committer1.commitTask(tContext1);
    +
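    +      // with the S3A committers, task commit leaves the output as
    +      // incomplete multipart uploads: nothing is visible at either
    +      // destination yet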
    +      assertMultipartUploadsPending(job1Dest);
    +      assertMultipartUploadsPending(job2Dest);
    +
    +      // commit jobs in order job 1, job 2
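    +      // committing job 1 should complete only its own uploads and
    +      // publish its output, leaving job 2's uploads still pending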
    +      committer1.commitJob(jContext1);
    +      assertNoMultipartUploadsPending(job1Dest);
    +      getPart0000(job1Dest);
    +      assertMultipartUploadsPending(job2Dest);
    +
    +      committer2.commitJob(jContext2);
    +      getPart0000(job2Dest);
    +      assertNoMultipartUploadsPending(job2Dest);
    +    } finally {
    +      // if the test fails, any uploads still pending under this path must be aborted
    +      abortMultipartUploadsUnderPath(job2Dest);
    +    }
    +
    +  }
    +
       protected void validateTaskAttemptPathDuringWrite(Path p) throws IOException {
     
       }
    --- End diff --
    
    Good catch on this case, and thanks for adding a test to cover it.
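    
    For anyone reading this hunk in isolation: the assertions above lean on helpers
    that list in-progress multipart uploads under a destination path. Those helpers
    are not part of this diff, so the following is only a rough illustrative sketch
    (the method name and the lack of pagination handling are assumptions, not the
    real hadoop-aws test code) of what such a check can look like against the
    AWS SDK v1 client that hadoop-aws builds on:
    
        import com.amazonaws.services.s3.AmazonS3;
        import com.amazonaws.services.s3.model.ListMultipartUploadsRequest;
        import com.amazonaws.services.s3.model.MultipartUploadListing;
    
        /** Count multipart uploads still pending under a bucket/prefix (illustrative sketch only). */
        static int countPendingUploads(AmazonS3 s3, String bucket, String prefix) {
          // NOTE: assumption/sketch; the real assertion helpers live elsewhere in hadoop-aws.
          // Only the first listing page is inspected; truncated listings are ignored for brevity.
          MultipartUploadListing listing = s3.listMultipartUploads(
              new ListMultipartUploadsRequest(bucket).withPrefix(prefix));
          return listing.getMultipartUploads().size();
        }
    
    A test can then assert that the count is non-zero after task commit and drops to
    zero once the corresponding job commit has completed the uploads.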


> Merge S3A committers into trunk
> -------------------------------
>
>                 Key: HADOOP-14971
>                 URL: https://issues.apache.org/jira/browse/HADOOP-14971
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.0.0
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>         Attachments: HADOOP-13786-040.patch, HADOOP-13786-041.patch
>
>
> Merge the HADOOP-13786 committer into trunk. This branch is being set up as a GitHub
> PR for review there and to keep it out of the mailboxes of the watchers on the main JIRA.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

