db-derby-dev mailing list archives

From Shreyas Kaushik <Shreyas.Kaus...@Sun.COM>
Subject Re: Adding new documentation for debugging test failures in the test framework
Date Thu, 28 Apr 2005 12:50:15 GMT
Thanks for your comments. I will take care of all the things you have 
mentioned here.

Myrna van Lunteren wrote:

>Kathey wrote: 
>>Shreyas would you mind generating the html from  forrest and sending
>>that? It is hard to review the xml without the  forrest build setup and
>>haven't had a chance to get that set up yet.
>I finally got around to looking at this myself... and I
>couldn't deal with reading this in xml (my browser just displayed the
>source). So, I loaded the thing into my old forrest 5 install & I'm
>attaching the html for Kathey. :-)
>Of course none of the tab-stuff works, but it makes for easier reading
>of the doc.
>And here are my 2 cents as to the contents:
>It appears that mostly Kathey's email was just put into the xml
Kathey had pretty much all of the stuff covered here; at least for me, 
whatever information Kathey provided was more than sufficient. Also, Kathey 
put it down
in a very good format, which I thought would be the right way to present it.

~ Shreyas

>Some formatting changes would definitely help readability.
>Anyways, it's great to see this doc. 
>Here's some specific changes I would suggest:
>- typos:
>  - Note - framewrok instead of framework
>  -  as Jack said, on 1 line of steps to be followed - Frist instead
>of First. Fro instead of for
>  -  line 3: netwrok server instead of network server
>- other: 
>   - I suggest a carriage return after the first sentence of each
>step. I think it's more easily readable. Just a suggestion, though.
>   -  I would suggest not to have mytest in quotes - or mention that
>the quotes are just to indicate it's not a real test. ok, this is
>probably obvious, but it struck me as being incorrect syntax when I
>first read the doc.
>   - change the line for network server to include derbyclient like so:
>       "For the network server you would need to add a framework
>property -Dframework=DerbyNetClient (or -Dframework=DerbyNet if you're
>testing the IBM Universal Driver - "jcc")"
>   - section 4: split out the a, b, c so they're below each other.
>again just for readability
>   - section 4 contains the suggestion to keep a clean client for comparison.
>     I think this would be better to put right after 'The old
>behavior'. (a). It's a really valuable step, to go back to what
>happens without your change. Also, I don't think 'client' is
>appropriate with svn, I suggest: 'setup' instead.
>  - when it seems the old behavior was incorrect, it may be valuable
>to see why that piece of code was put in & what related changes went
>into the test. To do this, use svn log <filename>. To see all changes
>to (other) files that went in for a particular revision, use: svn log
>-v -r <revisionnumber>.
>   (Of course, it could be erroneous accepted behavior stems from
>before contribution to Apache. In that case, svn won't help & you'll
>have to post to the list - possibly an IBM employee can backtrack, or
>folks on the list can confirm the behavior is not ok.)
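The two svn commands suggested above can be sketched together as below; the file path and revision number are placeholders, and the real call is guarded so the sketch only invokes svn inside an actual working copy:

```shell
# Work in a scratch directory so the guard below is deterministic.
cd "$(mktemp -d)"

# Show the change history of a single test file, to see why a piece of
# code was put in (the filename below is a placeholder):
#   svn log java/testing/org/apache/derbyTesting/functionTests/tests/lang/mytest.java
#
# Show all files that were changed in a particular revision (the
# revision number is a placeholder):
#   svn log -v -r 212345

# Guard: only invoke svn if this directory is actually a working copy.
if [ -d .svn ]; then
    svn log -v -r 212345
    in_working_copy=yes
else
    in_working_copy=no
fi
echo "in_working_copy=$in_working_copy"
```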
>  - section 5: again, split up the a, b, c
>  - section 6: ditto.
>  - as Jack said, derby.log is also a good source for analyzing
>trouble. If you've found the trouble code in the test, and made a
>small subset (e.g. a tiny .sql that you can run in ij, or a small java
>program that can be debugged easily) it will help to add at least the
>following properties to a (to be newly created) derby.properties file:
>      derby.infolog.append=true
>      derby.language.logStatementText=true
>      derby.stream.error.logSeverityLevel=0
>   I suggest putting something about this in section 4. 
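The three properties above can go into a newly created derby.properties file in the directory where the small repro is run; a minimal sketch (the scratch directory here is a placeholder for your own repro directory):

```shell
# Create a scratch directory for the standalone repro and write a
# derby.properties file with the debugging properties suggested above.
repro_dir="$(mktemp -d)"
cat > "$repro_dir/derby.properties" <<'EOF'
derby.infolog.append=true
derby.language.logStatementText=true
derby.stream.error.logSeverityLevel=0
EOF

# With these set, derby.log in this directory keeps its history across
# runs, logs every statement executed, and logs at all severity levels.
cat "$repro_dir/derby.properties"
```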
>  - I suggest adding a bit about debugging the tests in an IDE - for
>example eclipse, something like this:
>"If you can't easily reproduce or analyze the problem within a small
>ij case or program, you could try debugging the tests in - for example
>- eclipse. To do so, you need to add the following to the vm arguments
>for org.apache.derbyTesting.functionTests.harness.RunTest (your main class):
>     -Dclasspath=<mytrunk>/classes;<mytrunk>/tools/java/jakarta-oro-2.0.8.jar
> -Duser.dir=c:/testdir -Duseprocess=false
>   Without useprocess=false, the test will kick off a new jvm process
>to actually run the test and you cannot debug those threads.
>  But debugging a small test case is much better"
>- step 6: somewhere in here should go that you should report the
>environment you ran in, especially the jvm version and os.
>Thx for making this doc!
>   Debugging test failures with the Derby test framework
> This document gives details of how to debug test failures. It is 
> targeted at developers who contribute code to Derby and want to 
> investigate test failures caused by their fixes. Please post questions, 
> comments, and corrections to derby-dev@db.apache.org.
>     * Introduction <#introduction>
>     * Steps to be followed <#Steps+to+be+followed>
>       Introduction
> Note
> The contents of this document are mostly inputs I received from Kathey 
> Marsden.
> The Derby codebase has a slightly complicated test framework suite. 
> Although using the framework to run tests is very simple and the 
> framework itself gives extensive results of the tests that passed and 
> failed, it can get really tough to debug these test failures. The 
> following sections give a step by step insight into debugging test 
> failures.
>       Steps to be followed
> 1. First the test/s have to be run. The details for running the tests 
> can be found at ${derby.source}/java/testing/README.htm. The command 
> for running the test/s would be something like this,
>          java  -Dij.exceptionTrace=true -Dkeepfiles=true org.apache.derbyTesting.functionTests.harness.RunTest
> For the network server you would need to add the following property 
> -Dframework=DerbyNet
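Putting the pieces of step 1 together, a full invocation might look like the sketch below; the checkout path and test name are placeholders, and the java commands themselves are left commented out since they need a built Derby tree:

```shell
# Classpath with the built Derby classes plus the harness's jar
# dependency (the checkout path is a placeholder for your own tree).
DERBY_TRUNK="$HOME/derby/trunk"
CLASSPATH="$DERBY_TRUNK/classes:$DERBY_TRUNK/tools/java/jakarta-oro-2.0.8.jar"
export CLASSPATH

# Embedded run of a single test (the test name is a placeholder):
#   java -Dij.exceptionTrace=true -Dkeepfiles=true \
#       org.apache.derbyTesting.functionTests.harness.RunTest lang/mytest.sql

# The same test against the network server:
#   java -Dframework=DerbyNet -Dij.exceptionTrace=true -Dkeepfiles=true \
#       org.apache.derbyTesting.functionTests.harness.RunTest lang/mytest.sql
echo "CLASSPATH=$CLASSPATH"
```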
> 2. Do a visual diff of the output with the canon. It will give you 
> more context in approaching the problem. In the test output directory 
> you will see "mytest".out (filtered test output) and "mytest".tmp 
> (unfiltered test output - this one should have a trace). To get the 
> most information, it is advised to diff the tmp file with the canon 
> which is checked in under 
> java/testing/org/apache/derbyTesting/functionTests/master/ or the 
> appropriate framework or jdk subdirectory.
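Step 2 boils down to an ordinary diff between the unfiltered output and the checked-in canon. A self-contained sketch with fabricated files (in practice the two inputs would be mytest.tmp from the test output directory and master/mytest.out from the tree):

```shell
# Fabricate a tiny 'canon' and a tiny unfiltered test output so the
# diff below is runnable anywhere; the contents are invented stand-ins.
workdir="$(mktemp -d)"
printf 'ij> values 1;\n1\n' > "$workdir/mytest.out"   # stand-in canon
printf 'ij> values 1;\n2\n' > "$workdir/mytest.tmp"   # stand-in test output

# 'diff -u' shows exactly which line diverged, with a few lines of
# surrounding context that help locate the offending statement.
# diff exits non-zero when the files differ, so don't treat that as failure.
diff -u "$workdir/mytest.out" "$workdir/mytest.tmp" > "$workdir/mytest.diff" || true
cat "$workdir/mytest.diff"
```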
> 3. Identify the sql statement or java code causing the diff. For sql 
> scripts this is usually pretty obvious. For java programs you have to 
> look at the test source to figure out what is going on.
> 4. Evaluate the diff. Here of course starts the tricky and interesting 
> part. Look carefully at the sql command or java code that caused the 
> diff. Think about what it should do and how it relates to your change. 
> Decide what you think the right behaviour should be:
>    a) The old behaviour
>    b) The new behaviour
>    c) Something else
> If you have a trace, look at the trace and see if it holds any clue. 
> If you are lucky it passes right through the code you changed. Often 
> it is helpful to put that one sql command or bit of java code in a 
> separate script or stand alone program, so you can evaluate it 
> independently outside of the harness and evaluate it in the debugger. 
> Common results of the evaluation phase are:
>    a) An error in your code.
>    b) Someone else's test failure altogether. It is good to keep a 
> clean client for testing this.
>    c) A master update. Be careful with this one. Make sure you have 
> a valid reason to update a master. Be especially cognizant of backward 
> compatibility. We don't want any real or perceived regressions to 
> catch us by surprise.
> Note
> Two good questions to ask yourself and then answer when you post your 
> patch: 1) What is my valid reason for updating this master? 2) Might 
> someone be surprised by this difference and perceive it as a regression?
> 5. Resolve the issue. Here are some possible actions based on your 
> evaluation in step 4:
>    a) An error in your code. Go fix it!
>    b) Someone else's test failure altogether. Look at the recent 
> checkins, try to guess the culprit, and post.
>    c) A master update. Update the master and make sure you include 
> an explanation in your patch.
> 6. Reporting your findings. If you get stuck along the way, please 
> post to the list, but make sure you include:
>    a) The small bit of sql or java code in question.
>    b) A description of the old and new behaviour and what you think 
> the right behaviour should be.
>    c) The stack trace if there is one.
>    d) What you have learned so far from your own evaluation of steps 
> 1-3 above.
>    e) A specific question that is not going to take a lot of research 
> for someone to answer, and that might send you back along your way 
> if answered.
