
Revision as of 03:31, 10 June 2009

Fedora 12 Installation Test Plan

Revision history

Date          Revision  Comment
4 June 2009   0.1       Initial version


This document describes the tests that will be created and used to verify the installation of Fedora 12.

The goals of this plan are to:

  • Organize the test effort
  • Communicate the strategy, scope, and priorities of the planned tests to all relevant stakeholders for their input and approval
  • Serve as a base for the test planning for future Fedora releases

Test Strategy

Instead of outlining all possible installation inputs and outputs, this test plan focuses on defining inputs and outputs at different stages in anaconda. This also allows different tests to be performed independently during a single installation. For example, one may exercise kickstart delivery via HTTP, RAID0 partitioning using 3 physical disks, and a minimal package installation on a para-virtualized Xen guest, all in a single installation. Scenarios where the stages are dependent will be indicated as such in the test case.
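As an illustration, the combined scenario above (HTTP source, RAID0 across three disks, minimal package set) could be expressed in a single kickstart file. This is a hedged sketch only; the mirror URL and disk names (sda/sdb/sdc) are assumptions, not part of this plan:

```
# Illustrative kickstart sketch -- URL and disk names are assumptions
url --url=http://mirror.example.com/fedora/12/os/
clearpart --all --initlabel
part raid.01 --size=4096 --ondisk=sda
part raid.02 --size=4096 --ondisk=sdb
part raid.03 --size=4096 --ondisk=sdc
raid / --level=0 --device=md0 raid.01 raid.02 raid.03
%packages --nobase
%end
```

Such a file would then be delivered to the installer through one of the kickstart delivery methods tested below (for example, over HTTP via the ks= boot option).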

Where possible, SNAKE will be used to automate and aid in reproducibility.

Test Priority

This test plan uses a three-tier classification for test execution priority.

Tier1 is intended to verify that installation is possible on common hardware using common use cases. Verification includes:

  • Common boot media
  • Common installation sources
  • Installation using default installation options
  • Default partitioning

Tier2 takes a step further to include more use cases. Tier2 verification consists of:

  • All boot media
  • All installation sources
  • All kickstart delivery methods
  • Some architecture specific verification

Lastly, Tier3 captures the remaining identified use cases:

  • More exhaustive partitioning schemes
  • More complex networking scenarios
  • More architecture specific verification
  • Network device testing
  • Storage device testing
  • Upgrade testing


Testing will include:

  • Various methods of booting the installation program
  • Manual and kickstart execution of the installation program
  • System setup performed by the installation program (networking, modprobe.conf, bootloader, runlevel)
  • Booting the installed system

Items outside the scope of this test plan include:

  • Functional verification of software installed on the system
  • Installation from media not generated by fedora release engineering

Test Pass/Fail Criteria

Entrance criteria

  • Trees must be generated using release engineering tools (not hand crafted)
  • There must be no unresolved dependencies for packages included in the installation tree
  • There must be no dependency conflicts for packages included in the installation tree
  • Any changes in composition of the installation tree are explainable by way of bugzilla
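The dependency criteria above are normally checked with tools such as repoclosure against the real installation tree. As a hedged illustration of what "no unresolved dependencies" means, here is a minimal Python sketch over an invented package map; the tree data and the unresolved helper are illustrative names, not part of any Fedora tooling:

```python
# Toy sketch of the "no unresolved dependencies" entrance check.
# Real runs use repoclosure against the actual installation tree;
# the package data below is purely illustrative.
tree = {
    "glibc": [],
    "bash": ["glibc"],
    "python": ["glibc"],
    "anaconda": ["python", "glibc"],
}

def unresolved(tree):
    """Return (package, dependency) pairs whose dependency is missing from the tree."""
    return [(pkg, dep)
            for pkg, deps in tree.items()
            for dep in deps
            if dep not in tree]

# An empty result means the tree satisfies the entrance criterion.
print(unresolved(tree))  # prints []
```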

Alpha criteria

  • Entrance criteria have been met
  • All tier#1 tests have been executed

Beta criteria

  • Alpha criteria have been met
  • All tier#1 tests pass
  • All tier#2 tests have been executed

GA criteria

  • Beta criteria have been met
  • All test tiers must pass
  • Any open defects have been documented as release notes

Test Deliverables

  • This test plan
  • A test summary document for each major milestone
  • A list of defects filed
  • Any test scripts used for automation or verification

Test Cases (Functional)

Confirmation is needed on the following list of features for F12.

The following list of features was obtained from Anaconda/Features. Test plans for these features will be designed/developed on each feature page.

Test Cases (Non-Functional)

Image Sanity
Tests to check the checksums and sizes of the media.
  • QA:Testcase_Mediakit_ISO_Size
  • QA:Testcase_Mediakit_ISO_Checksums
  • QA:Testcase_Mediakit_Repoclosure
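The checksum cases above boil down to comparing the downloaded media against the published checksum file. A minimal sketch of that verification, using a stand-in file (the ISO name and contents here are assumptions; a real run checks the published Fedora 12 images):

```shell
# Hedged sketch of media checksum verification.
# The file below is a stand-in for a real Fedora 12 ISO.
echo "stand-in ISO contents" > Fedora-12-i386-DVD.iso

# Release engineering publishes a CHECKSUM file; we fabricate one here.
sha256sum Fedora-12-i386-DVD.iso > CHECKSUM

# The actual verification step a tester would run:
sha256sum -c CHECKSUM
```

A mismatch makes `sha256sum -c` report FAILED and exit non-zero, which is the signal that the media kit test has failed.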
Boot Methods
Tests designed to validate the bootable media.
  • QA/TestCases/BootMethodsBootIso
  • QA/TestCases/BootMethodsCdrom
  • QA/TestCases/BootMethodsDvd
  • QA:Testcase_efidisk.img
  • QA/TestCases/BootMethodsPxeboot
  • QA/TestCases/BootMethodsNetboot
  • QA/TestCases/BootMethodsXenParaVirt
  • QA/TestCases/BootMethodsKVM
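The PXE and network boot cases above depend on a pxelinux configuration entry that points at the installer kernel and initrd. A minimal sketch, with the file paths and mirror address as assumptions:

```
# pxelinux.cfg/default sketch -- paths and repo URL are assumptions
default fedora12
prompt 0
label fedora12
  kernel fedora12/vmlinuz
  append initrd=fedora12/initrd.img repo=http://192.168.1.1/fedora/12/os/
```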
Installation Source
The media booted and the installation source used aren't always the same. These tests verify installation using the described source.
  • QA/TestCases/InstallSourceHttp
  • QA/TestCases/InstallSourceNfs
  • QA/TestCases/InstallSourceFtpAnonymous
  • QA/TestCases/InstallSourceFtpNonAnonymous
  • QA/TestCases/InstallSourceCdrom
  • QA/TestCases/InstallSourceDvd
  • QA/TestCases/InstallSourceHardDrive
  • QA/TestCases/InstallSourceNfsIso
  • QA:TestCases/Install Source Live Image
Kickstart Delivery
Tests to validate acquiring a kickstart script through supported methods.
  • QA/TestCases/KickstartKsFilePathKsCfg
  • QA/TestCases/KickstartKsHdDevicePathKsCfg
  • QA/TestCases/KickstartKsHttpServerKsCfg
Package Sets
Designed to exercise the most common package dependency and conflict pathways.
  • QA/TestCases/PackageSetsDefaultPackageInstall
  • QA/TestCases/PackageSetsMinimalPackageInstall
Partitioning
The more common partitioning scenarios. These cases ensure that anaconda (and friends) prepare the disk for post-install booting as directed.
  • QA:Testcase_Anaconda_autopart_install
  • QA:Testcase_Anaconda_autopart_(encrypted)_install
  • QA:Testcase_Anaconda_autopart_(shrink)_install
  • QA:Testcase_Anaconda_autopart_(use_free_space)_install
  • QA/TestCases/PartitioningExt4OnNativeDevice
  • QA/TestCases/PartitioningExt3OnNativeDevice
  • QA/TestCases/PartitioningExt2OnNativeDevice
  • QA/TestCases/PartitioningRootfsOnLvmDevice
  • QA/TestCases/PartitioningRootfsOnRaid1
  • QA/TestCases/PartitioningNoSwap
  • QA/TestCases/PartitioningRaid0OnLvmDevice
  • QA/TestCases/PartitioningSwapOnLvmDevice
  • QA/TestCases/PartitioningUninitializedDisks
  • QA/TestCases/PartitioningUsrOnRaid0
  • QA/TestCases/PartitioningUsrOnRaid5
  • QA/TestCases/PartitioningUsrOnRaid6
  • QA/TestCases/PartitioningRootfsOnDmraidDevice
  • QA/TestCases/PartitioningPreExistingLvm2Lvm2
Error Handling and Recovery
When something goes wrong, the installer needs to handle it reasonably well.
  • QA:Testcase_Anaconda_rescue_mode
  • QA:Testcase_Anaconda_updates.img_via_URL
  • QA:Testcase_Anaconda_updates.img_via_installation_source
  • QA:Testcase_Anaconda_updates.img_via_local_media
  • QA:Testcases Anaconda save traceback to remote system
  • QA:Testcases Anaconda save traceback to bugzilla
  • QA:Testcases Anaconda save traceback to disk
  • QA:Testcases Anaconda traceback debug mode
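The updates.img cases above exercise anaconda's updates= boot option, which loads a patched installer image before installation starts. A hedged example of passing it at the installer boot prompt, with the server address as an assumption:

```
linux updates=http://192.168.1.1/anaconda/updates.img
```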
Storage Devices
These tests verify that storage devices are probed correctly and that the installed system boots in the following scenarios.
  • QA/TestCases/StorageDeviceSata
  • QA/TestCases/StorageDeviceScsi
  • QA:Testcase_Install_to_Pata_Device
  • QA:Testcase Anaconda iSCSI no authentication
  • QA:Testcase_Anaconda_partitioning_dmraid_rootfs
User Interface
Anaconda provides several user interfaces for installation. The following cases are designed to ensure the desired interface operates as expected.
  • QA/TestCases/UserInterfaceGraphical
  • QA/TestCases/UserInterfaceText
  • QA/TestCases/UserInterfaceVnc
  • QA/TestCases/UserInterfaceCmdline
  • QA/TestCases/UserInterfaceTelnet
System Upgrade
Tests to validate system upgrades.
  • QA:Testcase_Anaconda_Upgrade_New_Bootloader
  • QA:Testcase_Anaconda_Upgrade_Skip_Bootloader
  • QA:Testcase_Anaconda_Upgrade_Update_Bootloader
  • QA:Testcase_Anaconda_Upgrade_Encrypted_Root
  • QA:Testcase_Preupgrade
  • QA:Testcase_Preupgrade_from_older_release

Reporting Bugs and Debugging Problems


Test Environment/Configs


  • i386
  • ppc
  • x86_64


  • The Fedora 12 schedule is available at Releases/12/Schedule
  • The Fedora 12 test day is available at QA/Test_Days/F12
  • Each major milestone (Alpha, Beta, Preview Release) will require a full regression run