cloudstack-users mailing list archives

From Ron Wheeler <rwhee...@artifact-software.com>
Subject Re: Call for participation: Issue triaging and PR review/testing
Date Thu, 14 Dec 2017 19:52:47 GMT
Are there scripts for manual testing?

If there are scripts, these could certainly be run by 
non-developers/sysadmins as long as they have a test bed to use.

Such scripts would certainly be a "good thing" to have for acceptance 
testing by anyone planning to put CloudStack into production for 
themselves or a client.

What is the minimal hardware configuration required to test the UI and 
user-level functionality?
Do we have instructions for creating a minimal test station? Is it more 
than two old desktops and a hub?
Testing specific routers and networks would need additional hardware, but 
shouldn't organizations wanting to put a new version into production 
already have test/spare equipment?

What are the key UI/system features that are best tested by humans with 
a script?
  - Clearly, documentation and installation instructions are high on this 
list.

Ron

On 14/12/2017 2:15 PM, Luis wrote:
> Manual testing; I am not sure automated tests could cover an entire test.
>
>   
>    On Thu, Dec 14, 2017 at 11:25 AM, Paul Angus <paul.angus@shapeblue.com> wrote:
> Hi Luis,
>
> Can you explain what you mean, please? Do you mean people writing automated tests, or
> manual testing of discrete features?
>
>
>
> Kind regards,
>
> Paul Angus
>
> paul.angus@shapeblue.com
> www.shapeblue.com
> 53 Chandos Place, Covent Garden, London WC2N 4HS, UK
> @shapeblue
>    
>   
>
>
> -----Original Message-----
> From: Luis [mailto:lmartinez073@yahoo.com.INVALID]
> Sent: 14 December 2017 02:04
> To: users@cloudstack.apache.org; Ivan Kudryavtsev <kudryavtsev_ia@bw-sw.com>; users@cloudstack.apache.org
> Cc: dev <dev@cloudstack.apache.org>
> Subject: Re: Call for participation: Issue triaging and PR review/testing
>
> Hi
> What about creating a team for testing and creating a checklist of what to test and how,
> besides the people that use CS? This may increase the quality.
> Just an idea.
>
>   
>    On Wed, Dec 13, 2017 at 10:21 AM, Ivan Kudryavtsev <kudryavtsev_ia@bw-sw.com> wrote:
> Hi, Paul. Thank you for your response. I just still feel that it's a very
> risky approach to deliver a new release if the community hasn't adopted and tried the previous
> one, because future unidentified regressions are multiplied by currently unidentified regressions.
> But I see there is a trade-off and some controversy here.
>
> 2017-12-13 21:46 GMT+07:00 Paul Angus <paul.angus@shapeblue.com>:
>
>> Thanks Rene.
>>
>> @Ivan, I understand your concerns. But if 4.10 is unusable, then it
>> will never get much production testing.
>> The longer the gap between releases, the harder testing and triage become.
>>
>> By putting a line in the sand for 4.11 and 4.12, and with the desire
>> to keep making every release better than the last, we can keep moving forward.
>> I think we're all largely in agreement that the process around 4.10
>> was sub-optimal, which is why we've set out clear guidelines that we'd
>> like to work to.
>>
>> You are correct that there is more to quality than just Marvin tests
>> (or at least the current ones), and in the long term, if community members
>> like yourselves and Rene come up with tests/test structures that push
>> the boundaries of CloudStack, then automated testing will only get better.
>>
>> For now though, I would suggest that the best way to galvanise the
>> community around the manual testing of CloudStack is to have a release
>> candidate that everyone can coalesce around.
>>
>>
>>
>> Kind regards,
>>
>> Paul Angus
>>
>> paul.angus@shapeblue.com
>> www.shapeblue.com
>> 53 Chandos Place, Covent Garden, London WC2N 4HS, UK
>> @shapeblue
>>
>>
>>
>>
>> -----Original Message-----
>> From: Rene Moser [mailto:mail@renemoser.net]
>> Sent: 13 December 2017 12:56
>> To: dev <dev@cloudstack.apache.org>; users@cloudstack.apache.org
>> Subject: Re: Call for participation: Issue triaging and PR review/testing
>>
>> Hi all
>>
>> On 12/13/2017 05:04 AM, Ivan Kudryavtsev wrote:
>>> Hello, devs, users, Rohit. Have a good day.
>>>
>>> Rohit, you intend to freeze 4.11 on 8 January and, frankly speaking, I
>>> see risks here. A major risk is that 4.10 is too buggy, and it seems
>>> nobody actually uses it in production right now because it's unusable,
>>> unfortunately, so we are planning to freeze 4.11, which stands on an
>>> untested 4.10 with a lot of defects still undiscovered and unreported.
>>> I believe it's a very dangerous way to produce one more release with
>>> bad quality. Actually, Marvin and unit tests don't cover the regressions I meet
>>> in 4.10. OK, let's take a look at a new one our engineers found today in
>>> 4.10:
>>
>> So, the point is, how do we (users, devs, all) improve quality?
>>
>> Marvin is great for smoke testing, but CloudStack deals with many
>> infra vendor components which are not covered by the tests. How can we
>> detect flows not covered by Marvin?
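
(An aside for anyone wondering what a Marvin smoke test actually looks like: it is an
ordinary Python unittest driven by Marvin's API client. The sketch below is modelled
loosely on the tests under test/integration/smoke; the exact helper names, test-data
keys and signatures vary between releases, so treat it as indicative, not authoritative.)

    # Minimal Marvin-style smoke test -- a sketch, not a drop-in test.
    from marvin.cloudstackTestCase import cloudstackTestCase
    from marvin.lib.base import ServiceOffering, VirtualMachine
    from marvin.lib.common import get_zone, get_template

    class TestDeployAndDestroyVM(cloudstackTestCase):

        @classmethod
        def setUpClass(cls):
            cls.testClient = super(TestDeployAndDestroyVM, cls).getClsTestClient()
            cls.apiclient = cls.testClient.getApiClient()
            cls.services = cls.testClient.getParsedTestDataConfig()
            cls.zone = get_zone(cls.apiclient, cls.testClient.getZoneForTests())
            cls.template = get_template(cls.apiclient, cls.zone.id, cls.services["ostype"])
            cls.offering = ServiceOffering.create(cls.apiclient, cls.services["service_offering"])

        def test_deploy_destroy_vm(self):
            # Deploy a small VM, check that it reaches Running, then expunge it.
            vm = VirtualMachine.create(
                self.apiclient,
                self.services["small"],
                zoneid=self.zone.id,
                templateid=self.template.id,
                serviceofferingid=self.offering.id,
            )
            self.assertEqual(vm.state, "Running")
            vm.delete(self.apiclient, expunge=True)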
>>
>> For my part, I decided (independently of this discussion) to write integration
>> tests in a way one would not expect, not following the "happy path":
>>
>> Try to break CloudStack, to make a better CloudStack.
>>
>> Put a chaos monkey in your test infra: shut down storage, kill a host, put
>> latency on storage, disable networking on hosts, put load on a host,
>> make a cluster-wide primary filesystem read-only, shut down a VR, remove a VR.
>>
>> Things that can happen!
>>
>> Not surprisingly, I use Ansible. It has an extensive set of modules
>> which can be used to battle-test just about any part of your infra. Ansible playbooks
>> are fairly easy to write, even if you are not used to writing code.
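
(To make the fault-injection idea René describes more concrete, here is a rough sketch
that drives Ansible ad-hoc commands from Python. The inventory file name, group names
such as primary_storage and kvm_hosts, the device names and the services being stopped
are all hypothetical and would have to match your own test bed.)

    # Chaos-style fault injection via Ansible ad-hoc commands -- sketch only.
    import subprocess
    import time

    def ansible(pattern, module, args):
        """Run one Ansible ad-hoc command against a host pattern from the test-bed inventory."""
        subprocess.run(
            ["ansible", pattern, "-i", "testbed.ini", "-m", module, "-a", args],
            check=True,
        )

    # 1. Take primary storage away for five minutes, then bring it back.
    ansible("primary_storage", "service", "name=nfs-server state=stopped")
    time.sleep(300)
    ansible("primary_storage", "service", "name=nfs-server state=started")

    # 2. Add latency on the storage NIC of one host (Linux tc/netem).
    ansible("kvm_hosts[0]", "shell", "tc qdisc add dev eth1 root netem delay 200ms")

    # 3. Stop a host agent to simulate a host failure.
    ansible("kvm_hosts[0]", "service", "name=cloudstack-agent state=stopped")

    # After each injected fault, verify that CloudStack reacts as expected:
    # HA restarts VMs, alerts are raised, the VR and storage recover cleanly.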
>>
>> I will share my work when it is ready.
>>
>> René
>>
>>
>>
>>
>>
>>
>

-- 
Ron Wheeler
President
Artifact Software Inc
email: rwheeler@artifact-software.com
skype: ronaldmwheeler
phone: 866-970-2435, ext 102

