hbase-dev mailing list archives

From Doug Meil <doug.m...@explorysmedical.com>
Subject Re: unit tests
Date Mon, 28 Nov 2011 23:40:00 GMT

The development chapter is actually in developer.xml, but I think that adding this to the book would be a good idea.



On 11/28/11 6:30 PM, "Jesse Yates" <jesse.k.yates@gmail.com> wrote:

>Alternatively, a good place for this would also be in the book.xml, under
>(13) Building and Developing HBase
>
>-Jesse
>
>On Mon, Nov 28, 2011 at 3:10 PM, Todd Lipcon <todd@cloudera.com> wrote:
>
>> Haven't had a chance to review the full note below, but one request:
>> can we please put this in the code tree itself in a file like
>> TESTING.txt or README.testing.txt or something?
>>
>> It'd be nice to be able to refer back to it from within the code tree.
>>
>> -Todd
>>
>> On Mon, Nov 28, 2011 at 12:29 PM, N Keywal <nkeywal@gmail.com> wrote:
>> > Hi,
>> >
>> > The surefire modification seems to work well, so we can move to the final step.
>> > Please find below the final proposal for the tests. The behavior described can be activated with:
>> >  - a commit of hbase-4847
>> >  - a modification of the jenkins config to run mvn verify instead of mvn test.
>> >
>> > The main points are:
>> >  - should we run only small & medium tests by default on a developer machine => the answer here is yes
>> >  - should we run all tests on the central build => the answer here is yes
>> >  - should we stop when a small test fails => the answer here is yes
>> >  - should we stop when a medium test fails => the answer here is yes
>> >
>> >
>> > Also, I just had a discussion with Jesse, who thinks we should keep/have the notion of an integration test. If retained, the points above must be revisited to take this into account.
>> >
>> >
>> > FYI, as of today there are:
>> >  - 416 small tests, executed in ~3 minutes
>> >  - 489 medium tests, executed in ~35 minutes (without parallelization)
>> >  - 280 large tests, executed in ~90 minutes (without parallelization)
>> >
>> > --
>> > 1) Running tests
>> > HBase tests are divided into three categories: small, medium and large, with corresponding JUnit categories: SmallTests, MediumTests, LargeTests (a sketch of how a test declares its category follows below).
>> >
>> > - Small tests are executed in a shared JVM. This category contains all the tests that can be executed quickly (the maximum execution time for a test is 15 seconds) and that do not use a cluster.
>> > - Medium tests are tests that must be executed before proposing a patch. They are designed to run in less than 30 minutes altogether, and are quite stable in their results. They are designed to last less than 50 seconds individually. They can use a cluster, and each of them is executed in a separate JVM.
>> > - Large tests are everything else: typically integration tests, regression tests for specific bugs, timeout tests, and performance tests. Some of them can be flaky. They are executed before a commit on the pre-integration machines. They can be run on the developer machine as well.
>> >
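>> > For illustration, a minimal sketch of how a test declares its category, assuming JUnit's @Category annotation and the SmallTests marker interface mentioned above (the class and test names are made up):
>> >
>> >   import org.apache.hadoop.hbase.SmallTests; // package assumed; adjust to wherever the category interfaces live
>> >   import org.junit.Test;
>> >   import org.junit.experimental.categories.Category;
>> >   import static org.junit.Assert.assertEquals;
>> >
>> >   // Hypothetical small test: no cluster, finishes well under 15 seconds.
>> >   @Category(SmallTests.class)
>> >   public class TestRowKeyFormat {
>> >     @Test
>> >     public void testRoundTrip() {
>> >       byte[] row = "row-1".getBytes();
>> >       assertEquals("row-1", new String(row));
>> >     }
>> >   }
>> >
>> > A medium or large test is declared the same way, only with MediumTests.class or LargeTests.class as the category.
>> >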
>> > Commands are:
>> > 1) mvn test
>> >  - execute small tests in a single JVM and medium tests in a separate JVM for each test
>> >  - medium tests are NOT executed if there is an error in a small test
>> >  - large tests are NOT executed
>> >  - there is one report for small tests, and one report for medium tests (if they are executed)
>> >
>> > 2) mvn verify
>> >  - execute small tests in a single JVM, then medium tests in a separate JVM for each test, then large tests in a separate JVM as well
>> >  - medium tests are NOT executed if there is an error in a small test
>> >  - large tests are NOT executed if there is an error in a small or medium test
>> >  - there is one report per test category: small, medium and large
>> >
>> > 3) mvn test -P localTests -Dtest=myTests
>> >  - remove any category effect (without this specific profile, the profiles are taken into account)
>> >  - actually use the official release of surefire & the old connector to junit
>> >  - tests are executed in separate JVMs
>> >  - you will see a new message at the end of the report: "[INFO] Tests are skipped". It's harmless.
>> >
>> > 4) mvn test -P runAllTests
>> >  - execute small tests in a single JVM, then medium & large tests in a separate JVM for each test
>> >  - medium and large tests are NOT executed if there is an error in a small test
>> >  - large tests are NOT executed if there is an error in a small or medium test
>> >  - there is one report for small tests, and one report for medium & large tests (if they are executed)
>> >
>> > 5) Various other profiles
>> >  - mvn test -P runSmallTests   - execute small tests only, in a single JVM.
>> >  - mvn test -P runMediumTests  - execute medium tests in a single JVM.
>> >  - mvn test -P runLargeTests   - execute large tests in a single JVM.
>> >
>> > It is also possible to use the script 'hbasetests.sh'. This script runs the medium and large tests in parallel with two maven instances, and provides a single report. It must be executed from the directory which contains the pom.xml. Commands are:
>> > ./dev-support/hbasetests.sh              - execute small and medium tests
>> > ./dev-support/hbasetests.sh runAllTests  - execute all tests
>> > ./dev-support/hbasetests.sh replayFailed - rerun the failed tests a second time, in a separate JVM and without parallelization.
>> >
>> > 2) Writing tests
>> > Test rules & hints are:
>> > - As much as possible, tests should be written as small tests.
>> > - All tests must be written to support parallel execution on the same machine, hence they should not use shared resources such as fixed ports or fixed file names.
>> > - Tests should not over-log. More than 100 lines/second makes the logs hard to read and uses I/O that is then not available for the other tests.
>> > - Tests can be written with HBaseTestingUtility. This class offers helper functions to create a temp directory and do the cleanup, or to start a cluster (see the sketch after this list).
>> >
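>> > A minimal sketch of using HBaseTestingUtility for a temp directory in a small test (class and method names are made up; getDataTestDir is assumed to be the helper for the per-test data directory):
>> >
>> >   import org.apache.hadoop.fs.Path;
>> >   import org.apache.hadoop.hbase.HBaseTestingUtility;
>> >   import org.apache.hadoop.hbase.SmallTests; // package assumed
>> >   import org.junit.Test;
>> >   import org.junit.experimental.categories.Category;
>> >
>> >   // Hypothetical small test: uses a per-test data directory instead of a
>> >   // fixed file name, so it can run in parallel with other tests.
>> >   @Category(SmallTests.class)
>> >   public class TestLocalFiles {
>> >     private static final HBaseTestingUtility TEST_UTIL = new HBaseTestingUtility();
>> >
>> >     @Test
>> >     public void testWritesUnderTestDir() throws Exception {
>> >       Path dir = TEST_UTIL.getDataTestDir("testWritesUnderTestDir");
>> >       // ... create and read files under 'dir'; no fixed ports, no fixed paths ...
>> >     }
>> >   }
>> >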
>> > - Categories and execution time
>> >  - All tests must be categorized; otherwise they may be skipped.
>> >  - All tests should be written to be as fast as possible.
>> >  - Small tests should last less than 15 seconds, and must not have any side effect.
>> >  - Medium tests should last less than 45 seconds.
>> >  - Large tests should last less than 3 minutes; this ensures good parallelization for the tests using this category, and eases the analysis when the test fails.
>> >
>> > - Sleeps:
>> >    - Whenever possible, tests should not use sleep, but rather wait for the real event. This is faster and clearer for the reader.
>> >    - Tests should not do a 'Thread.sleep' without testing an ending condition. This makes it clear what the test is waiting for. Moreover, the test will work whatever the machine's performance.
>> >    - Sleeps should be minimal, to be as fast as possible. Waiting for a variable should be done in a 40ms sleep loop. Waiting for a socket operation should be done in a 200 ms sleep loop. (A sketch of such a wait loop follows below.)
>> >
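>> > To make the sleep guidance concrete, a sketch of a bounded wait loop; the helper and its Condition interface are made up for illustration, not an existing HBase API:
>> >
>> >   // Wait for a condition in a 40 ms sleep loop, with an explicit timeout,
>> >   // instead of a single long Thread.sleep.
>> >   private static void waitFor(long timeoutMs, Condition condition) throws Exception {
>> >     long deadline = System.currentTimeMillis() + timeoutMs;
>> >     while (!condition.isMet()) {
>> >       if (System.currentTimeMillis() > deadline) {
>> >         throw new AssertionError("condition not met within " + timeoutMs + " ms");
>> >       }
>> >       Thread.sleep(40);
>> >     }
>> >   }
>> >
>> >   // Minimal condition interface for the helper above.
>> >   private interface Condition {
>> >     boolean isMet() throws Exception;
>> >   }
>> >
>> > The test then fails with a clear message when the event never happens, and finishes as soon as the event occurs on a fast machine.
>> >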
>> > - Tests using a cluster:
>> >    - Tests using an HRegion do not have to start a cluster: a region can use the local file system.
>> >    - Starting/stopping a cluster costs around 10 seconds. A cluster should not be started per test method but per test class (see the sketch after this list).
>> >    - A started cluster must be shut down using HBaseTestingUtility#shutdownMiniCluster, which cleans up the directories.
>> >    - As much as possible, tests should use the default settings for the cluster. When they don't, they should document it. This will make it possible to share the cluster later.
>> >
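>> > A minimal sketch of the per-class cluster lifecycle under these rules (test class and method names are made up; startMiniCluster/shutdownMiniCluster are the HBaseTestingUtility calls referred to above):
>> >
>> >   import org.apache.hadoop.hbase.HBaseTestingUtility;
>> >   import org.apache.hadoop.hbase.MediumTests; // package assumed
>> >   import org.junit.AfterClass;
>> >   import org.junit.BeforeClass;
>> >   import org.junit.Test;
>> >   import org.junit.experimental.categories.Category;
>> >
>> >   // Hypothetical medium test: the mini cluster is started once per class,
>> >   // with default settings, and shut down via shutdownMiniCluster().
>> >   @Category(MediumTests.class)
>> >   public class TestWithMiniCluster {
>> >     private static final HBaseTestingUtility TEST_UTIL = new HBaseTestingUtility();
>> >
>> >     @BeforeClass
>> >     public static void setUpBeforeClass() throws Exception {
>> >       TEST_UTIL.startMiniCluster();
>> >     }
>> >
>> >     @AfterClass
>> >     public static void tearDownAfterClass() throws Exception {
>> >       TEST_UTIL.shutdownMiniCluster(); // cleans up the test directories
>> >     }
>> >
>> >     @Test
>> >     public void testSomethingAgainstTheCluster() throws Exception {
>> >       // ... exercise the cluster via TEST_UTIL ...
>> >     }
>> >   }
>> >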
>>
>>
>>
>> --
>> Todd Lipcon
>> Software Engineer, Cloudera
>>
>
>
>
>-- 
>-------------------
>Jesse Yates
>240-888-2200
>@jesse_yates


