No longer current: The compose for which this page contains results is no longer the current one.
This page records installation validation test results for the Fedora 33 20200811.n.0 nightly compose.
Which tests to run
Test coverage page: The test coverage page provides information about coverage of the tests on this page across all the composes for the current release; it can help you see which test cases most need to be run.
Tests with a Milestone of Basic, Beta or Final are the most important. Optional tests are less so, but still useful to run. The milestone indicates that for that milestone release or a later one to be approved for release, the test must have been run against the release candidate build (so at Beta, all Basic and Beta tests must have been run, for instance). However, it is important to run the tests for all milestones as early and often as possible. Please refer to the test coverage page linked above and try to find and run tests which have not been run at all, or not run recently, for the current release.
How to test
1. Download one or more media for testing.
2. Perform one or more of the test cases and add your results to the table below
- You can submit results by editing the page directly, or by using relval, with the relval report-results command; it provides a simple text interface for reporting test results (see the sketch after this list).
3. If a test fails, file a bug report. You may propose the bug as a release blocker or freeze exception bug for the appropriate release - see blocker bug process and freeze exception bug process.
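A minimal command-line sketch of the relval route, assuming relval is installed (the package name relval in the Fedora repositories is an assumption here):

sudo dnf install relval    # assumed package name for the relval tool
relval report-results      # interactive text interface for submitting results to validation pages like this one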
Some tests must be run against particular Products or images - for example, the #Default boot and install tests. If no particular product or image is specified either in this page or the test case page, you can use any appropriate image. For example, you can run most of the #General Tests with the Workstation live image, or either of the Server install images.
If you notice a problem during a test which does not constitute a complete failure of the test, you should still file a bug report, but it may not be appropriate to propose it as a release blocker bug. Use your judgment in deciding this, with reference to the Fedora_Release_Criteria, which these tests are intended to verify. If you are unsure, err on the side of proposing the bug as a blocker.
Don't install updates
Don't install updates before performing any of the tests: when you are testing pre-releases, available updates are not part of the proposed release package set.
Results summary page
The Test Results:Fedora 33 Rawhide 20200811.n.0 Summary page contains the results from this page and all the other validation pages for the same compose listed together to provide an overview.
Add, Modify or Remove a Test Case
- Please request review for your changes by publishing your test case for review to the test mailing list and/or the appropriate working group mailing list (e.g. server, cloud, or desktop).
- Once reviewed, make your changes to any current documents that use the template (e.g. Test_Results:Current_Installation_Test).
- Lastly, update Template:Installation_test_matrix with the same changes.
Key
See the table below for a sample format for test results. All test results are posted using the result template.
Test Result | Explanation | Code Entered
none | Untested - This test has not been run, and is available for anyone to contribute feedback. | {{result|none}}
pass robatino | Passed - The test has been run and the tester determined that it met the expected results. | {{result|pass|robatino}}
inprogress adamwill | Inprogress - An inprogress result is often used for tests that take a long time to execute. Inprogress results should be temporary and change to pass, fail or warn. | {{result|inprogress|adamwill}}
fail jlaska | Failed - Indicates a failed test. A link to a bug must be provided. See Template:Result for details on providing bug information. | {{result|fail|jlaska|XYZ|ZXY}}
warn rhe [1] | Warning - This test completed and met the expected results, but other issues encountered during testing warrant attention. | {{result|warn|rhe}} <ref>Brief description about the warning status</ref>
pass hongqing warn kparal | Multiple results - Multiple people can provide results for a single test case. | {{result|pass|hongqing}} {{result|warn|kparal}}
fail pboy | Failed - Same issue with LVM again. | {{result|fail|pboy|2246871|2244305}}
pass previous <build> run | Result from previous test run - This test result is directly moved from the test run of the previous <build>. | {{result|pass|previous <build> run}}
 | Unsupported - An unsupported test or configuration. No testing is required. | 

- ↑ Brief description about the warning status
Test Matrix
Please click [show] in each table to view the tests for each installation medium, and click [edit] to post your test results using the syntax in the Key section.
Image sanity
Details
The result column titles are variants in Pungi/productmd parlance. For each variant, the checksums for all images in that variant can be checked; the maximum size for all images in that variant which have a maximum size can be checked; and repoclosure and fileconflicts can be checked for any DVD image in that variant. Any failure for any tested image should be filed as a bug and reported as a failure here. Please provide the bug ID and a short note of exactly which image(s) failed as a comment.
- ↑ Everything boot armhfp, size 758251520, max 734003200
- ↑ Everything boot x86_64, size 741343232, max 734003200
- ↑ Server boot armhfp, size 758355968, max 734003200
- ↑ Server boot x86_64, size 741343232, max 734003200
- ↑ Workstation live ppc64le, size 2002489344, max 2000000000
- ↑ Workstation live x86_64, size 2046820352, max 2000000000
- ↑ LXQt live x86_64, size 1499627520, max 1400000000
- ↑ SoaS live x86_64, size 1179549696, max 734003200
- ↑ Astronomy_KDE live x86_64, size 4296112128, max 4000000000
- ↑ Python_Classroom live x86_64, size 2116141056, max 2000000000
- ↑ Scientific_KDE live x86_64, size 4106403840, max 3300000000
- ↑ Security live x86_64, size 2044821504, max 2000000000
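As a rough command-line sketch of the checksum and size checks described above (the image file name is a placeholder for whichever image you downloaded, and 734003200 bytes is simply the boot image maximum from the notes above, used as an example):

IMAGE=Fedora-Server-netinst-x86_64-33-20200811.n.0.iso    # placeholder file name
sha256sum --ignore-missing -c ./*-CHECKSUM                # verify the image against the compose CHECKSUM file
test "$(stat -c %s "$IMAGE")" -le 734003200 && echo "size OK" || echo "size exceeds maximum"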
Default boot and install
Single test table: In all of these tests, the test case used is QA:Testcase_Boot_default_install; that is where the links point. The same test needs to be run for multiple images, target platforms, and install media. Note that the non-installer-based ARM disk images are covered by the later #ARM disk images section. The VM columns are for results from testing in a virtual machine. The CD/DVD columns are for results from testing on a real system with the image written to a real CD or DVD. The USB columns are for results from testing on a real system with the image written to a USB stick (see the sketch below for one way to do this).
Expected coverage: For Beta, we expect a reasonable sampling of tests across the table, with at least some testing for the VM and USB boot methods, both firmware types, and each major class of deliverable (netinst, live and DVD). For Final, we expect full coverage of the Basic / Final rows with the VM and USB boot methods. Optical boot testing from physical media at Final is optional (but blocking if issues are found) for supported images.
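For the USB columns, besides Fedora Media Writer (covered in the next table), one way to write an image to a stick is dd. A hedged sketch; the image name and /dev/sdX are placeholders, and dd will overwrite the target device, so double-check it first:

sudo dd if=Fedora-Workstation-Live-x86_64-33-20200811.n.0.iso of=/dev/sdX bs=8M status=progress oflag=direct
sync    # flush write caches before removing the stick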
Fedora Media Writer
Milestone | Test Case | Fedora 31 | Fedora 32 | Fedora 33 | Windows 10 | macOS
Beta / Optional | QA:Testcase_USB_fmw | none | none | none | none | none
ARM disk images
Single test table: In all of these tests, the test case used is QA:Testcase_arm_image_deployment; that is where the links point. The same test needs to be run for multiple images and target platforms (a deployment sketch follows below).
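As a rough deployment sketch, assuming arm-image-installer from the Fedora repositories is used (the image name, target board and device are placeholders, and the tool overwrites the target media):

sudo arm-image-installer --image=Fedora-Server-armhfp-33-20200811.n.0-sda.raw.xz --target=rpi3 --media=/dev/sdX --resizefs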
PXE boot
Virtualization
Storage devices
Guided storage configuration
Guided storage shrinking
Environments
For this test, the column headings refer to the storage volume type to be shrunk, not the one chosen to replace it for the new installation.
Custom storage configuration
Advanced custom storage configuration
User interface
Installation repositories
Package sets
Kickstart
Upgrade
Internationalization and Localization
Miscellaneous