This page records base validation test results for Rawhide nightly builds in the month of June 2014.

Not mandatory testing
This testing is not in any way mandatory, but rather precautionary and exploratory. No Fedora release depends on this testing being performed. Due to the longer Fedora 21 release cycle, we think it is a good idea to keep an eye on the status of Rawhide before we reach the Alpha milestone; however, no one should feel obliged in any way to participate in this testing, and certainly not to work excessively long, late or hard on it.

How to test

  1. Download the required Rawhide nightly live image, x86_64 or i386 network install image, or ARM or Cloud disk image. If you use the network install image, enter the date on which you downloaded it as the 'build date' in your results (see #Key). Some tests may specify use of either a traditional installer image (the DVD image, or a net install image) or a live image; please follow these specifications. Note that if generation of the boot.iso images fails for a given day, the file will not be present on the mirrors (the last successful compose is not retained). In this case, you just have to wait for a successful compose, or use one of the other images/methods described on this page.
  2. Perform one or more of the test cases and add your results to the table below; see the result-posting example at the end of this section.
  3. If a test fails, file a bug report, and propose the bug as a blocker for the appropriate release (see blocker bug process). If you notice a problem during a test which does not constitute a complete failure of the test, you should still file a bug report, but it may not be appropriate to propose it as a blocker. Use your judgment in deciding this, with reference to the Fedora_Release_Criteria, which these tests are intended to verify. If you are unsure, err on the side of proposing the bug as a blocker.
  4. Don't install updates before performing any of the tests: when you are testing pre-releases, available updates are not part of the package set proposed for release.
Virtual machine testing
In most cases, testing in a virtual machine is OK.
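Results are posted by editing this page and adding an entry in the Template:Result format, plus the build date, to the appropriate cell of the test matrix below. As a minimal sketch, assuming a hypothetical tester jdoe who passed a test on the 20140623 Minimal ARM nightly and also wants to flag a minor issue with a note, the cell contents would look something like:

  {{result|pass|jdoe}} 20140623-Minimal
  {{result|warn|jdoe}} <ref>Brief description of the issue</ref> 20140623-Minimal

A failure is recorded the same way, with the bug number(s) added to the template (for example {{result|fail|jdoe|XYZ}}, where XYZ stands for the Bugzilla bug ID); see #Key for the full set of result codes.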

Add or Remove a Test Case

  1. Request review of your changes by posting your proposed test case to test@lists.fedoraproject.org.
  2. Once reviewed, make your changes to any current documents that use this template (e.g. Test Results:Fedora 21 Rawhide 2014 06 Base); see the sketch after this list.
  3. Lastly, update QA:Base validation results template with the same changes.
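As a sketch of step 2, adding a hypothetical test case QA:Testcase_base_example means inserting a new wikitext table row into the matrix, with all result columns initially untested; the exact cell layout should be copied from the existing rows in the page source:

  |-
  | [[QA:Testcase_base_example]]
  | {{result|none}}
  | {{result|none}}
  | {{result|none}}
  | <references/>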

Key

See the key below for sample formats for test results. All test results are posted using the format specified in Template:Result.


Test Results Format

none
  Untested - This test has not been run, and is available for anyone to contribute feedback.
  Code entered: {{result|none}}

pass robatino 20140615
  Passed - The test has been run and the tester determined that it met the expected results. The date of the build tested (not the date the test was run) should be included.
  Code entered: {{result|pass|robatino}} 20140615

inprogress adamwill
  Inprogress - An inprogress result is often used for tests that take a long time to execute. Inprogress results should be temporary and change to pass, fail or warn.
  Code entered: {{result|inprogress|adamwill}}

fail jlaska [1] [2] 20140615
  Failed - Indicates a failed test. A link to a bug must be provided; see Template:Result for details on providing bug information. The date of the build tested (not the date the test was run) should be included. References: 1. RHBZ #XYZ, 2. RHBZ #ZXY.
  Code entered: {{result|fail|jlaska|XYZ|ZXY}} 20140615

warn rhe [1] 20140615
  Warning - This test completed and met the expected results of the test, but other issues were encountered during testing that warrant attention. The date of the build tested (not the date the test was run) should be included. Reference: 1. Brief description about the warning status.
  Code entered: {{result|warn|rhe}} <ref>Brief description about the warning status</ref> 20140615

pass hongqing 20140615, warn kparal 20140618
  Multiple results - More people can easily provide results to a single test case.
  Code entered: {{result|pass|hongqing}} 20140615 {{result|warn|kparal}} 20140618

Unsupported
  An unsupported test or configuration. No testing is required.
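For reference, a cell holding multiple results is simply several Template:Result entries in sequence. The QA:Testcase_base_system_logging row of the matrix below, for example, is produced by wikitext along these lines (one cell per column: test case, x86(_64), ARM, Cloud, references):

  | [[QA:Testcase_base_system_logging]]
  | {{result|none}}
  | {{result|pass|pwhalen}}20140606-Minimal{{result|pass|pwhalen}}20140623-Minimal
  | {{result|none}}
  | <references/>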

Test Matrix

Release Level | Test Case | x86(_64) | ARM | Cloud | References
Alpha | QA:Testcase_base_initial_setup | none | pass pwhalen 20140604-Minimal, pass pwhalen 20140606-Minimal, pass pwhalen 20140608-Workstation, pass pwhalen 20140623-Minimal | |
Alpha | QA:Testcase_base_startup | none | pass pwhalen 20140602 | none |
Alpha | QA:Testcase_base_system_logging | none | pass pwhalen 20140606-Minimal, pass pwhalen 20140623-Minimal | none |
Final | QA:Testcase_Services_start | none | pass pwhalen 20140606-Minimal, pass pwhalen 20140623-Minimal | none |