From: Amila Jayasekara <thejaka.amila@gmail.com>
To: dev@airavata.apache.org
Date: Tue, 17 Dec 2013 08:14:00 -0500
Subject: Re: [DISCUSS] Release Methodology

This is nice!

-AJ

On Tue, Dec 17, 2013 at 3:41 AM, Saminda Wijeratne <samindaw@gmail.com> wrote:

> To see how it might go, I created a simple spreadsheet [1] to record test
> results: "0" for untested, a positive value for passed tests, and a
> negative value for failed tests.
>
> I realized it would be overwhelming for a single developer to carry out
> all the tests, so I think it's easier to just go on with the usual tests
> we do, mark in the spreadsheet what we covered, and later the RM (or
> someone) can figure out a way to carry out the tests that were not
> covered by anyone.
>
> Let me know if anyone needs edit privileges to the spreadsheet.
>
> Saminda
>
> 1. https://docs.google.com/spreadsheet/ccc?key=0AqYI3-ZrFz-EdFI0S3htaWJ1MDV4RE1WM19Ga0lhbEE&usp=sharing#gid=0
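As a rough illustration of the 0/positive/negative convention above (a
hypothetical sketch, not tooling from this thread), an RM could export the
sheet to CSV and tally coverage with a few lines of Python. The file name
"release_tests.csv" and the "test"/"result" column headers are assumptions:

    # Tally a test-results sheet exported as CSV, using the convention
    # described above: 0 = untested, positive = passed, negative = failed.
    # The file name and column headers are assumed, not the real schema.
    import csv

    def tally(path):
        untested, passed, failed = [], [], []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                score = int(row["result"])
                bucket = untested if score == 0 else passed if score > 0 else failed
                bucket.append(row["test"])
        return untested, passed, failed

    untested, passed, failed = tally("release_tests.csv")
    print(f"passed: {len(passed)}  failed: {len(failed)}  untested: {len(untested)}")
    for name in untested:
        print("needs coverage:", name)

Something like this would let the RM see at a glance which tests nobody has
picked up yet.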
> On Mon, Dec 16, 2013 at 1:12 PM, Amila Jayasekara
> <thejaka.amila@gmail.com> wrote:
>
>> +1, please.
>>
>> - Thejaka Amila
>>
>> On Mon, Dec 16, 2013 at 2:05 PM, Suresh Marru <smarru@apache.org> wrote:
>>
>>> On Dec 16, 2013, at 1:51 PM, Saminda Wijeratne <samindaw@gmail.com> wrote:
>>>
>>> > I was thinking of an actual checklist where we can check off/vote off
>>> > once each test is done. Perhaps we can start with a simple
>>> > spreadsheet with the tests specified by Raman added.
>>>
>>> +1. Here is an example from Rave: a template for quality assurance [1]
>>> and an example of verification results [2].
>>>
>>> Bottom line: for at least a few days during the release process, we
>>> all should become the QA team.
>>>
>>> Currently we are doing scripted testing, like the 5- and 10-minute
>>> tutorials and grid job submissions, and a lot of code still does not
>>> get touched. As an example, provenance-aware search became
>>> nonfunctional, and until Sanjaya pointed it out, we did not notice it.
>>> It would be useful if, randomly or by coordination, we all tested an
>>> RC against various features and then posted the results to the DISCUSS
>>> thread. Otherwise, the releases just become pointing to a tag. We need
>>> to move from releases being a formality to every release making the
>>> code more robust. We have so much active development, and if we turn
>>> some energy to testing and bug fixing, I think our users will be happy
>>> with the outcome.
>>>
>>> Suresh
>>> [1] - http://wiki.apache.org/rave/ReleaseManagement/QualityAssurance
>>> [2] - http://wiki.apache.org/rave/ReleaseManagement/ReleaseSchedule/VerificationResults-0.11
>>>
>>> > On Mon, Dec 16, 2013 at 10:24 AM, Chathuri Wimalasena
>>> > <kamalasini@gmail.com> wrote:
>>> > There is a general checklist added by Raman [1], which covers the
>>> > basic functionality.
>>> >
>>> > Thanks,
>>> > Chathuri
>>> >
>>> > [1] https://cwiki.apache.org/confluence/display/AIRAVATA/Airavata+Release+Testing
>>> >
>>> > On Mon, Dec 16, 2013 at 12:56 PM, Saminda Wijeratne
>>> > <samindaw@gmail.com> wrote:
>>> >
>>> > On Mon, Dec 16, 2013 at 9:28 AM, Suresh Marru <smarru@apache.org> wrote:
>>> > Thanks, Amila, for weighing in. Comments inline:
>>> >
>>> > On Dec 16, 2013, at 11:29 AM, Amila Jayasekara
>>> > <thejaka.amila@gmail.com> wrote:
>>> >
>>> > > Hi Suresh,
>>> > >
>>> > > I have some comments inline.
>>> > >
>>> > > On Mon, Dec 16, 2013 at 10:53 AM, Suresh Marru <smarru@apache.org> wrote:
>>> > > Hi All,
>>> > >
>>> > > This is a very good question. Let's discuss these options so we
>>> > > are consistent across releases.
>>> > >
>>> > > If we look at the way we are doing releases, we are calling a
>>> > > feature freeze and a code freeze and cutting a release. Most of
>>> > > the time, our build is broken. The Jenkins statistics for Airavata
>>> > > are not looking good at all [1].
>>> > >
>>> > > There is something wrong with the Jenkins configuration. I tried
>>> > > to figure it out some time back but was unable to do so. Even
>>> > > though builds are successful on our local machines, they fail
>>> > > intermittently in Jenkins.
>>> > >
>>> > > We barely fix the build a day before the release, put out an RC,
>>> > > test it, and release it in quick succession.
>>> > >
>>> > > This is not entirely true. For the past few months I have
>>> > > experienced only one or two build breaks (maybe fewer). I build a
>>> > > couple of times per week. I believe the build is usually stable,
>>> > > and with the integration tests passing, we always get a workable
>>> > > version.
>>> > > I know that not relying on the build server is bad practice, but
>>> > > the committers have the personal discipline to keep the build
>>> > > stable. Nevertheless, we must fix the Jenkins configuration issue.
>>> >
>>> > Maybe we should put some focus on the Jenkins configuration? Any
>>> > volunteers?
>>> >
>>> > > As we are seeing on the user lists, we have users upgrading with
>>> > > every release. I think we should increase the release quality.
>>> > >
>>> > > +1 for this.
>>> > >
>>> > > I would vote for at least 3 RCs per release. If we are not finding
>>> > > issues in the first RC, I would say either the software has
>>> > > magically become too good or we are not doing thorough testing. I
>>> > > suspect the latter.
>>> > How about we keep a checklist of release tests? I know we already
>>> > send a mail to dev on what needs to be tested for each RC, but I
>>> > think that is too abstract. For the core developers of Airavata, I
>>> > think there should be predefined test cases (a test document, if you
>>> > will). Since we have several core developers on the list, we can at
>>> > least decide on what must be tested and make sure that each test
>>> > case is covered by at least one developer for an RC.
>>> > >
>>> > > I guess you mentioned this under the assumption that the build is
>>> > > not stable.
>>> >
>>> > Half of my assumption is based on Jenkins, so if the builds are OK
>>> > and Jenkins is reporting them wrongly, then we can alleviate the
>>> > problem by fixing it.
>>> >
>>> > > I will propose the following; please counter it and let's agree on
>>> > > a process:
>>> > >
>>> > > * Let's post an RC1 as is (which means it will have a snapshot
>>> > > version). This pack we should all test as much as possible, so it
>>> > > is more of a test candidate than a release candidate. If it helps,
>>> > > we can use the name TC1. I am not particular about the naming but
>>> > > am trying to emphasize the need for more RCs per release.
>>> > >
>>> > > I am not sure whether we really need a TC. The release manager
>>> > > should be doing some verification on the RC before putting it out;
>>> > > therefore it should be an RC. Anyhow, I am fine with having the TC
>>> > > concept and trying it out.
>>> >
>>> > We should probably stick to RC, but I think the onus should not be
>>> > on the RM to test it. They should coordinate and mobilize everyone
>>> > to do the testing, including doing a bit more testing than others.
>>> > But my point is, we should test, and the only way to do that is to
>>> > put out a series of RCs and have focused testing.
>>> > A TC should be something internal, IMO. But when we are going for a
>>> > release, it should be alpha, beta, and then RC releases. I think it
>>> > need not be mandatory for the RMs to do a pre-evaluation of the
>>> > builds other than making sure all the unit tests and integration
>>> > tests pass. Once an RC is confirmed to be of release quality, I
>>> > think we can follow the actual release cycle from the trunk itself,
>>> > since it is in a code freeze anyway.
>>> >
>>> > Suresh
>>> >
>>> > > What we really need is a set of verifiable test cases.
>>> > >
>>> > > Thank you
>>> > > Regards
>>> > > Amila
>>> > >
>>> > > * If we do not expose significant issues in RC/TC 1, then we
>>> > > proceed with RC2, which will follow the proper release process.
>>> > > But if reasonable issues are brought out, we need an RC2/TC2,
>>> > > again without following the release process.
>>> > >
>>> > > * The key thing I am proposing is that we keep doing RC/TCs until
>>> > > we are all sure the quality is good enough, with documented known
>>> > > issues.
>>> > > When we are sure, then we proceed to an RC with the proper
>>> > > release process.
>>> > >
>>> > > So this will mean more testing, and twice (or more) the time
>>> > > everyone has to test, but I think it is worth it. This might also
>>> > > run over the 6-week release cycle, but I think we need to trade
>>> > > some time for quality releases as we march towards 1.0.
>>> > >
>>> > > Suresh
>>> > > [1] - https://builds.apache.org/job/Apache%20Airavata/
>>> > >
>>> > > On Dec 15, 2013, at 4:28 PM, Lahiru Gunathilake <glahiru@gmail.com> wrote:
>>> > >
>>> > > > Hi Chathuri,
>>> > > >
>>> > > > I think having a snapshot as the version in an RC is wrong.
>>> > > > Every RC has to be like a release, and if it passes, we just
>>> > > > call a vote/discussion thread and do the release. If we do it
>>> > > > with a snapshot and things go right, we then have to change the
>>> > > > versions and test again. We could do the release just by
>>> > > > changing the snapshot version without retesting, but that is
>>> > > > wrong, AFAICT.
>>> > > >
>>> > > > I remember making this mistake in an earlier release with the
>>> > > > RC1 build. I think we can stick to the release management
>>> > > > instructions on airavata.org.
>>> > > >
>>> > > > Regards
>>> > > > Lahiru
>>> > > >
>>> > > > On Fri, Dec 13, 2013 at 3:43 PM, Chathuri Wimalasena
>>> > > > <kamalasini@gmail.com> wrote:
>>> > > > Hi All,
>>> > > >
>>> > > > Airavata 0.11 RC1 [1] is ready for testing.
>>> > > >
>>> > > > Here are some pointers for testing:
>>> > > > • Verify the fixed issues for this release [2]
>>> > > > • Verify the basic workflow composition/execution/monitoring
>>> > > >   scenarios from the Airavata 5- & 10-minute tutorials [3],[4]
>>> > > > • Verify the Airavata client samples
>>> > > > • Verify the stability with the Derby & MySQL backend databases
>>> > > > • Verify that the XBaya JNLP distribution works
>>> > > > • Verify deploying the Airavata server in a Tomcat distribution
>>> > > > Please report any issues [5] you encounter while testing. Thank
>>> > > > you for your time in validating the release.
>>> > > >
>>> > > > Regards,
>>> > > > Chathuri (On behalf of the Airavata PMC)
>>> > > >
>>> > > > [1] https://dist.apache.org/repos/dist/dev/airavata/0.11/RC1/
>>> > > > [2] https://issues.apache.org/jira/browse/AIRAVATA-278?jql=project%20%3D%20AIRAVATA%20AND%20fixVersion%20%3D%20%220.11%22%20ORDER%20BY%20status%20DESC%2C%20priority%20DESC
>>> > > > [3] http://airavata.apache.org/documentation/tutorials/airavata-in-5-minutes.html
>>> > > > [4] http://airavata.apache.org/documentation/tutorials/airavata-in-10-minutes.html
>>> > > > [5] https://issues.apache.org/jira/browse/AIRAVATA
>>> > > >
>>> > > > --
>>> > > > System Analyst Programmer
>>> > > > PTI Lab
>>> > > > Indiana University
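One way to guard against the snapshot-in-RC mistake Lahiru describes would
be a pre-RC check that no pom.xml in the tree still declares a -SNAPSHOT
version. This is a minimal sketch under the assumption of a standard
multi-module Maven checkout, not the project's actual release tooling:

    # Flag any pom.xml under the given root that still carries a -SNAPSHOT
    # version before the RC is cut. A sketch only; assumes the RC is built
    # from a source checkout rooted at ".".
    import pathlib
    import re

    def snapshot_poms(root="."):
        hits = []
        for pom in pathlib.Path(root).rglob("pom.xml"):
            text = pom.read_text(encoding="utf-8", errors="replace")
            if re.search(r"<version>[^<]*-SNAPSHOT</version>", text):
                hits.append(str(pom))
        return hits

    if __name__ == "__main__":
        for pom in snapshot_poms():
            print("snapshot version still present in:", pom)

Run from the release tag's checkout, an empty result would mean the versions
were actually bumped before the RC was rolled.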