hadoop-hdfs-issues mailing list archives

From "Andrew Wang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HDFS-12398) Use JUnit Paramaterized test suite in TestWriteReadStripedFile
Date Wed, 20 Sep 2017 00:16:00 GMT

    [ https://issues.apache.org/jira/browse/HDFS-12398?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16172531#comment-16172531 ]

Andrew Wang commented on HDFS-12398:
------------------------------------

Hi Huafeng, I agree that this doesn't reduce duplication much; my idea was to split it into
multiple classes to avoid the Surefire class-level timeout.

As an example, right now we have a bunch of test methods that look like this:

{code}
  @Test
  public void testFileMoreThanABlockGroup1() throws Exception {
    testOneFileUsingDFSStripedInputStream("/ec/MoreThanABlockGroup1",
        blockSize * dataBlocks + 123);
    testOneFileUsingDFSStripedInputStream("/ec/MoreThanABlockGroup12",
        blockSize * dataBlocks + 123, true);
  }
....
  private void testOneFileUsingDFSStripedInputStream(String src, int fileLength)
      throws Exception {
    testOneFileUsingDFSStripedInputStream(src, fileLength, false);
  }
{code}

We could change it as follows:

{code:title=Parent.java}
  @Test
  public void testFileMoreThanABlockGroup1() throws Exception {
    testOneFileUsingDFSStripedInputStream("/ec/MoreThanABlockGroup1",
        blockSize * dataBlocks + 123);
  }
....
  protected void testOneFileUsingDFSStripedInputStream(String src, int fileLength)
      throws Exception {
    testOneFileUsingDFSStripedInputStream(src, fileLength, false);
  }
{code}

{code:title=Subclass.java}
  @Override
  protected void testOneFileUsingDFSStripedInputStream(String src, int fileLength)
      throws Exception {
    testOneFileUsingDFSStripedInputStream(src, fileLength, true);
  }
{code}
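
Put together, the split would look roughly like this (the subclass name below is only
illustrative, not necessarily what the patch would use):

{code:title=Sketch of the split (illustrative)}
// Parent keeps all the @Test methods; the two-argument helper defaults to the
// no-datanode-failure case and is the only thing a subclass needs to override.
public class TestWriteReadStripedFile {
  // ... @Test methods and helpers as in Parent.java above ...
}

// The subclass inherits every @Test method unchanged but flips the failure flag,
// so the failure cases run as a separate class under their own Surefire timeout.
public class TestWriteReadStripedFileWithDNFailure extends TestWriteReadStripedFile {
  @Override
  protected void testOneFileUsingDFSStripedInputStream(String src, int fileLength)
      throws Exception {
    testOneFileUsingDFSStripedInputStream(src, fileLength, true);
  }
}
{code}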

We could combine this with JUnit parameterization if you want; it seems like Kai had some
reservations, though. Personally I'm okay with parameterizing; it makes it harder to run a
single test case on the command line, but IntelliJ IDEA knows how to do this.
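
For reference, a rough sketch of the parameterized form using the JUnit 4 Parameterized
runner (the parameter field and wiring are illustrative; blockSize, dataBlocks and the
three-argument helper are assumed to be the existing ones in the test):

{code:title=Parameterized sketch (illustrative)}
import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;

@RunWith(Parameterized.class)
public class TestWriteReadStripedFile {

  // Run every test method twice: once without and once with datanode failure.
  @Parameterized.Parameters(name = "withDNFailure={0}")
  public static Collection<Object[]> data() {
    return Arrays.asList(new Object[][] {{false}, {true}});
  }

  @Parameterized.Parameter
  public boolean withDNFailure;

  @Test
  public void testFileMoreThanABlockGroup1() throws Exception {
    // Same case as today, but the failure flag comes from the runner.
    testOneFileUsingDFSStripedInputStream("/ec/MoreThanABlockGroup1",
        blockSize * dataBlocks + 123, withDNFailure);
  }
}
{code}

Each method then shows up twice in the report, e.g. testFileMoreThanABlockGroup1[withDNFailure=true].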

> Use JUnit Paramaterized test suite in TestWriteReadStripedFile
> --------------------------------------------------------------
>
>                 Key: HDFS-12398
>                 URL: https://issues.apache.org/jira/browse/HDFS-12398
>             Project: Hadoop HDFS
>          Issue Type: Improvement
>          Components: test
>            Reporter: Huafeng Wang
>            Assignee: Huafeng Wang
>            Priority: Trivial
>              Labels: flaky-test
>         Attachments: HDFS-12398.001.patch, HDFS-12398.002.patch
>
>
> TestWriteReadStripedFile basically runs the full cross product of file sizes with and
> without datanode failure. It would be better to use a JUnit Parameterized test suite.



