From Fedora Project Wiki


Revision as of 19:42, 17 September 2014 by Roshi (talk | contribs)


This is a collection of docs related to testing the Cloud product. As these get solidified, they'll move to more official places than my wikispace. Currently on the tick-list:

  • Identify Relevant Release Criteria
  • Draft a Test Plan
  • Draft a Test Matrix for each Milestone

Relevant Release Criteria

This is not meant to replace the existing RC pages. It was just easier for me to parse after taking out all the parts that don't directly deal with Cloud images. All release criteria referenced in the test plan will correlate to the actual Fedora 21 Release Criteria.

Test Overview

Test Plan

Test Matrix

Traditionally we've had a couple of validation matrices for a particular release (Base, Installation, and Desktop). However, for cloud I think a single matrix should suffice for our testing needs. It's basically a fork of the QA:Base_validation_results_template with cloud testcases and milestones added in.
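To make the single-matrix idea concrete, here's a minimal sketch of the matrix as data: testcases down one axis, milestones across the other, with a cell per result. The testcase and milestone names are illustrative placeholders, not the real wiki template's contents.

```python
# Hypothetical sketch of a single cloud validation matrix.
# Testcase and milestone names below are made up for illustration.

MILESTONES = ["Alpha", "Beta", "Final"]
TESTCASES = [
    "QA:Testcase_base_startup",
    "QA:Testcase_base_services_start",
    "QA:Testcase_cloud_image_boot",
]

# results[testcase][milestone] -> "pass", "fail", or None (not yet run)
results = {tc: {m: None for m in MILESTONES} for tc in TESTCASES}
results["QA:Testcase_cloud_image_boot"]["Alpha"] = "pass"

def coverage(results):
    """Fraction of (testcase, milestone) cells with a recorded result."""
    cells = [r for per_tc in results.values() for r in per_tc.values()]
    return sum(r is not None for r in cells) / len(cells)

print(f"{coverage(results):.2f}")
```

One matrix like this, rather than three, keeps the reporting burden for cloud testing low while still showing at a glance what hasn't been run for a given milestone.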

Known Testcases

This is the list of testcases that I've found related to cloud images. Some of these need cleanup and an assessment of whether they can be automated.

Atomic and Docker images won't be release-blocking for F21. However, the hope is to have them be proper products for F22. Here is where we'll document the process to use when F22 gets here. I see this happening in three phases: Plan/Document, Draft, Integrate.


Plan/Document

The first step is to figure out what needs to be tested for Atomic and Docker images. We also have to look at the existing Release Criteria and draft any changes they might need in order to accommodate Atomic and Docker images. Use the two pages below to add ideas for testcases.


Draft

Once the needed tests are decided on, we'll write the testcases and Release Criteria out here on the wiki. From there, Atomic/Cloud/QA can discuss the proposed changes. After the testcases are written, cross-referenced with the Release Criteria, and discussed, work can start on automating them.
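The cross-referencing step above could be sketched as a simple coverage check: given which criteria each drafted testcase exercises, flag criteria no testcase covers yet. The criterion and testcase names here are made up for illustration; they're not the real Fedora criteria identifiers.

```python
# Hypothetical sketch: flag Release Criteria with no covering testcase.
# All names below are illustrative, not real Fedora QA identifiers.

criteria = {"image-boots", "cloud-init-runs", "package-install-works"}

# Map each drafted testcase to the criteria it exercises.
testcases = {
    "QA:Testcase_cloud_image_boot": {"image-boots"},
    "QA:Testcase_cloud_init": {"cloud-init-runs"},
}

covered = set().union(*testcases.values())
uncovered = sorted(criteria - covered)
print(uncovered)
```

Running a check like this before the discussion phase would make gaps in coverage an explicit agenda item rather than something discovered at blocker review time.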


Integrate

The last step is to work with QA and figure out the best way to support the added changes for F22. Things that will need to be hashed out include manpower (who will do the testing, who shows up to blocker review meetings) and, I'm sure, other things I haven't thought of yet.