From Fedora Project Wiki


Fedora 9 Installation Test Plan

Revision history

Date Revision Comment
10 December 2007 0.1 Initial version


This document describes the tests that will be created and used to verify the installation of Fedora 9.

The goals of this plan are to:

  • Organize the test effort
  • Communicate the strategy, scope, and priorities of the planned tests to all relevant stakeholders for their input and approval
  • Serve as a base for the test planning for future Fedora releases

Test Strategy

Instead of outlining all possible installation inputs and outputs, this test plan focuses on defining inputs and outputs at different stages in anaconda. This also allows different tests to be performed independently during a single installation. For example, one may exercise kickstart delivery via HTTP, RAID 0 partitioning using three physical disks, and a minimal package installation on a para-virtualized Xen guest, all in a single installation. Scenarios where the stages are dependent will be indicated as such in the test case.

Where possible, SNAKE (the Smart Network Automated Kickstart Environment) will be used to automate tests and aid reproducibility.
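The combined scenario described above (HTTP install source, RAID 0 across three physical disks, minimal package set) can be sketched as a single kickstart file. This is an illustrative fragment only, not a tested configuration; the mirror URL and disk names are hypothetical placeholders.

```
# Illustrative kickstart fragment -- mirror URL and disk names are hypothetical
install
url --url=http://mirror.example.com/fedora/9/os/

# /boot cannot live on RAID 0, so keep it on a plain partition
part /boot --fstype=ext3 --size=200 --ondisk=sda
part swap --size=512 --ondisk=sda

# RAID 0 root filesystem striped across three physical disks
part raid.01 --size=1 --grow --ondisk=sda
part raid.02 --size=1 --grow --ondisk=sdb
part raid.03 --size=1 --grow --ondisk=sdc
raid / --fstype=ext3 --level=RAID0 --device=md0 raid.01 raid.02 raid.03

# Minimal package set
%packages --nobase
%end
```

Because each stage (install source, partitioning, package selection) is specified independently here, a single run of this file exercises three test cases at once, which is exactly the strategy outlined above.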

Test Priority

This test plan uses a three-tier classification for test execution priority.

Tier 1 is intended to verify that installation is possible on common hardware using common use cases. Verification includes:

  • Common boot media
  • Common installation source
  • Installation using default installation options
  • Default partitioning

Tier 2 goes a step further to include more use cases. Tier 2 verification consists of:

  • All boot media
  • All installation sources
  • All kickstart delivery methods
  • Some architecture specific verification

Lastly, Tier 3 captures the remaining identified use cases:

  • More exhaustive partitioning schemes
  • More complex networking scenarios
  • More architecture specific verification
  • Network device testing
  • Storage device testing
  • Upgrade testing


Scope

Testing will include:

  • Various methods of booting the installation program
  • Manual and kickstart execution of the installation program
  • System setup performed by the installation program (networking, modprobe.conf, bootloader, runlevel)
  • Booting the installed system

Items outside the scope of this test plan include:

  • Functional verification of software installed on the system
  • Installation from media not generated by Fedora release engineering

Test Pass/Fail Criteria

  • Entrance criteria
    • Trees must be generated using release engineering tools (not hand crafted)
    • There must be no unresolved dependencies for packages included in the installation tree
    • There must be no dependency conflicts for packages included in the installation tree
    • Any changes in the composition of the installation tree are explainable by way of Bugzilla
  • Alpha criteria
    • Entrance criteria have been met
    • All Tier 1 tests have been executed
  • Beta criteria
    • Alpha criteria have been met
    • All Tier 1 tests pass
    • All Tier 2 tests have been executed
  • GA criteria
    • Beta criteria have been met
    • All test tiers must pass
    • Any open defects have been documented as release notes
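The dependency-related entrance criteria above can be checked mechanically with repoclosure from yum-utils, which reports packages in a repository whose dependencies cannot be resolved. A sketch, assuming the candidate installation tree is exposed at a local path; the repo id and path are placeholders:

```
# Report unresolved dependencies in the candidate tree (hypothetical path)
repoclosure --tempcache \
            --repofrompath=f9-tree,/srv/trees/f9/i386/os \
            --repoid=f9-tree
```

An empty report satisfies the "no unresolved dependencies" criterion; any packages listed should be triaged in Bugzilla before the tree enters testing.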

Test Deliverables

  • This test plan
  • A test summary document for each major milestone
  • A list of defects filed
  • Any test scripts used for automation or verification

Test Cases (Functional)

Test Cases (Non-Functional)


Install Source

Test Case Environment(s)
QA/TestCases/InstallSourceHttp all
QA/TestCases/InstallSourceNfs all
QA/TestCases/InstallSourceNfsIso all

Package Sets

Test Case Environment(s)
QA/TestCases/PackageSetsDefaultPackageInstall all
QA/TestCases/PackageSetsMinimalPackageInstall all


Partitioning

Test Case Environment(s)
QA/TestCases/PartitioningExt3OnNativeDevice all
QA/TestCases/PartitioningRootfsOnLvmDevice all
QA/TestCases/PartitioningRootfsOnRaid1 all

User Interface

Test Case Environment(s)
QA/TestCases/UserInterfaceGraphical all
QA/TestCases/UserInterfaceText all
QA/TestCases/UserInterfaceVnc all


Boot Methods

Test Case Environment(s)
QA/TestCases/BootMethodsBootIso all
QA/TestCases/BootMethodsCdrom all
QA/TestCases/BootMethodsDvd all
QA/TestCases/BootMethodsUsb i386, x86_64
QA/TestCases/BootMethodsNetboot ppc
QA/TestCases/BootMethodsPxeboot i386, x86_64
QA/TestCases/BootMethodsXenParaVirt x86_64
QA/TestCases/BootMethodsRescueMode all

Installation Source

Test Case Environment(s)
QA/TestCases/InstallSourceCdrom all
QA/TestCases/InstallSourceDvd all
QA/TestCases/InstallSourceFtpAnonymous all
QA/TestCases/InstallSourceHardDrive all

Kickstart Delivery

Test Case Environment(s)
QA/TestCases/KickstartKsFilePathKsCfg all
QA/TestCases/KickstartKsHdDevicePathKsCfg all
QA/TestCases/KickstartKsHttpServerKsCfg all
QA/TestCases/KickstartKsNfsServerPathKsCfg all
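The four delivery methods in the table above map onto anaconda's ks= boot option. As a hedged illustration (server names, device names, and paths are placeholders), the boot-prompt forms are roughly:

```
# Kickstart file built into the boot image
linux ks=file:/ks.cfg

# Kickstart file on a local hard drive partition
linux ks=hd:sda1:/ks.cfg

# Kickstart file fetched over HTTP
linux ks=http://server.example.com/kickstarts/ks.cfg

# Kickstart file on an NFS export
linux ks=nfs:server.example.com:/export/ks.cfg
```

Each form corresponds to one test case in the table, so a single kickstart file can be reused across all four delivery tests.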

Package Sets

Test Case Environment(s)
QA/TestCases/PackageSetsEverything all


Partitioning

Test Case Environment(s)
QA/TestCases/PartitioningExt2OnNativeDevice all
QA/TestCases/PartitioningNoSwap all
QA/TestCases/PartitioningRaid0OnLvmDevice all
QA/TestCases/PartitioningSwapOnLvmDevice all
QA/TestCases/PartitioningUninitializedDisks all
QA/TestCases/PartitioningUsrOnRaid0 all
QA/TestCases/PartitioningUsrOnRaid5 all
QA/TestCases/PartitioningUsrOnRaid6 all

Storage Devices

Test Case Environment(s)
QA/TestCases/StorageDeviceSata all
QA/TestCases/StorageDeviceScsi all

User Interface

Test Case Environment(s)
QA/TestCases/UserInterfaceCmdline all
QA/TestCases/UserInterfaceTelnet all


Boot Methods

Test Case Environment(s)
QA/TestCases/BootMethodsInstallFromBootIsoIfs ppc
QA/TestCases/BootMethodsInstallFromNetbootImgAsStmf ppc
QA/TestCases/BootMethodsInstallFromVirtualCdromIfs ppc
QA/TestCases/BootMethodsRescuecdIsoCdrom all

Installation Source

Test Case Environment(s)
QA/TestCases/InstallSourceFtpNonAnonymous all
QA/TestCases/InstallSourceHttpIpv6 all

Kickstart Delivery

Package Sets


Partitioning

Test Case Environment(s)
QA/TestCases/PartitioningPreExistingLvm2Lvm2 all
QA/TestCases/PartitioningPreExistingRaidRaid all

Storage Devices

Test Case Environment(s)
QA/TestCases/StorageDeviceiScsi all

User Interface

Test Case Environment(s)
QA/TestCases/UserInterfaceSerial all


Updates and Tracebacks

Test Case Environment(s)
QA/TestCases/UpdatesImgPrompt x86_64
QA/TestCases/UpdatesImgViaTree x86_64
QA/TestCases/UpdatesImgViaHttp x86_64
QA/TestCases/UpdatesImgViaUsb x86_64
QA/TestCases/TracebackSaveRemote x86_64
QA/TestCases/TracebackDebugMode x86_64

Test Environment/Configs

  • Hardware
    • i386
    • ppc
    • x86_64
  • Hardware (subject to secondary arch availability)
    • ia64
    • s390x


Responsibilities

  • Who is doing what


Schedule

  • When they are doing it

Risks and Contingencies

  • What might go wrong and how it will be handled


Approvals

Date Approver Comment
10 December 2007 JamesLaska I approve this message



Outstanding Issues

  • How do we collect test feedback?
    • Option 1: privileged users can modify the wiki directly
    • Option 2: email (eeew) ... probably going to have a bit of this
    • ... ?
  • How do we present test results?
    • Separate wiki page / test plan ... a test summary report?
    • An application to store and query test results?