From Fedora Project Wiki

Revision as of 09:36, 16 December 2009 by Rhe (talk | contribs) (→‎Meeting)



This page is a draft only
It is still under construction and content may change. Do not rely on the information on this page.


Overview

The purpose of AutoQA install test automation is to simplify testing, reduce test execution time, and improve efficiency. The AutoQA install test project should meet the following goals:

  • The system should be easy for both testers and developers to use
  • Clear documentation should exist for customizing and creating new tests
  • Test execution should be able to use existing Fedora infrastructure services, but not require them
  • Test results should be easy to verify

Problem Space

The Fedora installer is a complicated software application that often requires significant setup time to properly and efficiently test. Installer failures typically come from the following areas:

Image Sanity

  1. ISO file size is too large (or too small)
  2. Invalid SHA256 checksum
  3. Invalid implanted ISO md5sum
  4. Install environment - anaconda has specific application, library and config file format requirements
  5. Version checks fail
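The first three image sanity checks above can be sketched as a small Python helper. This is an illustrative sketch, not existing AutoQA code: the expected size bound and SHA256 value are assumed to be supplied by the caller, and the implanted md5sum check shells out to the checkisomd5 tool (from the isomd5sum package), which may not be installed.

```python
import hashlib
import os
import subprocess

def check_iso_sanity(iso_path, expected_sha256, max_size):
    """Run basic sanity checks on an ISO image; returns a dict of results."""
    results = {}

    # 1. File size is non-zero and within the expected bound
    size = os.path.getsize(iso_path)
    results["size_ok"] = 0 < size <= max_size

    # 2. SHA256 checksum matches the published value
    sha = hashlib.sha256()
    with open(iso_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            sha.update(chunk)
    results["sha256_ok"] = sha.hexdigest() == expected_sha256

    # 3. Implanted ISO md5sum (checkisomd5 exits 0 on success)
    try:
        rc = subprocess.call(["checkisomd5", iso_path])
        results["implanted_md5_ok"] = rc == 0
    except OSError:
        results["implanted_md5_ok"] = None  # tool not installed
    return results
```

A driver script would run this before any install test is attempted and skip the run if any check fails.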

Boot Methods

  1. Boot media improperly built (PXE, boot.iso, CD/DVD, efidisk.img)
  2. Installer fails to boot as a KVM guest
  3. Installer fails to boot as a Xen guest

Install Source

  1. Unable to detect install.img media
  2. Unable to transition to the stage 2 installer

Kickstart Delivery

  1. ks.cfg could not be obtained from the specified location (http, ftp, nfs, hd, initrd)
  2. Install fails to proceed in accordance with the directives in the ks.cfg file
  3. Install improperly sets up networking based on command-line and kickstart network parameters (boot with ksdevice=eth1, ks.cfg contains eth2)
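A test for the first kickstart delivery failure could start with a check like the following sketch, which fetches a ks.cfg from a URL and verifies it is readable and contains the directives the test expects. The helper and the required-directive set are hypothetical examples; a full implementation would parse the file properly (e.g. with pykickstart) rather than scanning first words.

```python
from urllib.request import urlopen

REQUIRED_DIRECTIVES = ("install", "rootpw", "bootloader")  # example set

def fetch_and_check_ks(url, required=REQUIRED_DIRECTIVES):
    """Fetch a kickstart file and verify required directives appear.

    Returns (ok, missing) where missing lists the absent directives.
    """
    with urlopen(url) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    # Collect the first word of each non-empty, non-comment line
    directives = {
        line.split()[0]
        for line in text.splitlines()
        if line.strip() and not line.lstrip().startswith("#")
    }
    missing = [d for d in required if d not in directives]
    return (not missing, missing)
```

The same retrieval step exercises the http/ftp/nfs delivery paths listed above when given the corresponding URL schemes.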

User Interface

  1. X driver problems while transitioning to graphical install
  2. Screen corruption during text-mode install
  3. VNC fails to start
  4. Serial console redirection improperly set up

Storage Devices

  1. Failure to detect existing storage device(s)
  2. Failure to clear stale data from existing devices
  3. Unable to add iSCSI volumes

Partitioning

  1. Failure detecting existing partition scheme (lvm, mdraid, dmraid, luks)
  2. Failure when attempting to resize existing partitions
  3. Failures while attempting to re-use existing partitions
  4. Improperly clearing stale information from disks
  5. Unable to consistently resize an existing filesystem
  6. General failures while attempting to manually partition a system

Package Repository

  1. Unable to read metadata from package repositories (http, ftp, nfs, media)
  2. Failures while adding or modifying existing package repositories

Package Set

  1. Network timeout while retrieving packages
  2. Dependency problems while resolving package list
  3. File conflicts during package install
  4. Package order and install errors in install.log
  5. Improperly formatted comps.xml data
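The comps.xml case lends itself to a direct automated check. The sketch below uses the standard library XML parser to test well-formedness and a minimal structural expectation (each group has an id and a package list); the exact element names checked here are an assumption about the comps format, not a complete schema.

```python
import xml.etree.ElementTree as ET

def check_comps(xml_text):
    """Return a list of problems found in comps data; empty list means OK."""
    problems = []
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as e:
        return ["not well-formed XML: %s" % e]
    for i, group in enumerate(root.iter("group")):
        # find() returns None when the child element is absent
        if group.find("id") is None:
            problems.append("group %d has no <id>" % i)
        if group.find("packagelist") is None:
            problems.append("group %d has no <packagelist>" % i)
    return problems
```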

Boot loader configuration

  1. Unable to properly detect other operating systems
  2. Failure while setting up chainloader for another OS

Upgrade system

  1. Failure to detect previously installed systems
  2. Errors while attempting to update bootloader configuration during upgrade
  3. Package upgrade errors in upgrade.log

Recovery

  1. Rescue mode fails to detect existing installations (lvm, raid, luks)
  2. Rescue mode fails to establish networking
  3. Problems saving traceback information (local disk, bugzilla, remote server)
  4. Anaconda unable to download and use updates.img (from install source, local media or URL)
  5. Unable to transition to debug mode

Proposed Solution

First, in order to provide a consistent and documented test approach, the existing Fedora Install test plan [1] will be revisited. The test plan will be adjusted to ensure proper test coverage for the failure scenarios listed above. Existing test cases will be reviewed for accuracy. New test cases will be created using the Template:QA/Test_Case template. Finally, the test plan will be adjusted to match the improved Fedora Release Criteria [2]. This includes adjusting the test case priority to match milestone criteria.

Next, in order to reduce setup and execution time, improve efficiency, and provide test results on a more consistent basis, a subset of test cases will be chosen for automation. Tests will be written in Python and will be developed and executed on a system supporting KVM virtualization. Test scripts will be responsible for preparing a virtual install environment, initiating a kickstart install, and validating the results. Once an initial batch of tests exists, they will be formally integrated into the AutoQA project.
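A script along these lines might drive a KVM guest install with virt-install, booting the installer from an install source and passing a kickstart URL on the kernel command line. The guest name, paths, and resource sizes below are illustrative assumptions, not project conventions:

```python
import subprocess

def build_virt_install_cmd(name, source, ks_url, disk_path,
                           ram_mb=1024, disk_gb=10):
    """Build a virt-install command line that boots the installer
    from an install source and hands it a kickstart URL."""
    return [
        "virt-install",
        "--name", name,
        "--ram", str(ram_mb),
        "--disk", "path=%s,size=%d" % (disk_path, disk_gb),
        "--location", source,                   # install tree or ISO
        "--extra-args", "ks=%s console=ttyS0" % ks_url,
        "--graphics", "none",                   # headless, serial console
        "--noautoconsole",
    ]

def start_install(name, source, ks_url, disk_path):
    cmd = build_virt_install_cmd(name, source, ks_url, disk_path)
    # Kicks off the guest install; the caller then polls for completion
    # and collects logs from the guest for validation.
    return subprocess.Popen(cmd)
```

Splitting command construction from execution keeps the builder testable without a hypervisor present.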

Last, a method will be developed for collecting test results into a single test result matrix. Results may be posted to the wiki directly, or a custom turbogears application may be needed to display results [3]. The results will be easily accessible for testers and the installer development team.
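If results are posted to the wiki directly, the matrix could be rendered as MediaWiki table markup. The function below is a hypothetical sketch of that step; it assumes the wiki's result templates ({{result|pass}} and friends) are available, as on the Fedora wiki:

```python
def results_to_wiki_table(results):
    """Render {test_name: bool_or_None} as a MediaWiki table.

    True -> pass, False -> fail, None -> not run.
    """
    lines = ['{| class="wikitable"', "! Test case !! Result"]
    for name in sorted(results):
        outcome = results[name]
        if outcome is True:
            cell = "{{result|pass}}"
        elif outcome is False:
            cell = "{{result|fail}}"
        else:
            cell = "{{result|none}}"
        lines.append("|-")
        lines.append("| %s || %s" % (name, cell))
    lines.append("|}")
    return "\n".join(lines)
```

A turbogears application could consume the same {test: outcome} mapping and render HTML instead.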

Scope

  • Create test cases covering the possible anaconda breakages listed above (DVD, CD-ROM, boot CD, live CD)
  • Revise the Fedora Install test plan to ensure test coverage exists for the failure scenarios listed above
  • Any tester from the community can download and install the automation framework and run the test cases
  • After testing completes, testers post their results to the designated wiki page
  • Create Python scripts to prepare KVM-based virtual environments for testing, initiate kickstart installs, and validate the outcome
  • Provide a simple interface for customizing tests
  • Provide a test result display page that makes results easy to review; the result format should be suitable for posting to the designated page without significant rework
  • Write test cases from simple to complex, starting with kickstart installs and moving to graphical installs

Active Ingredients

  • AutoQA must be packaged and available in Fedora infrastructure
  • A valid backend test harness is packaged and available in Fedora infrastructure (Autotest currently being used by AutoQA)
  • Knowledge of TurboGears or TurboGears2
  • A new AutoQA test watcher may be required to initiate automated tests whenever CD and DVD ISO images are available for testing (post-iso-build)

Roadmap

  • Provide links to TRAC roadmaps used to monitor progress

Results

Version 1.0

  • Goal: Create a basic system; keep it simple and get it running successfully.
  • Case: DVD input, supporting the simplest test cases.
  • Platform: Virtual machine (KVM).
  • Inputs and Outputs:
    • Inputs: DVD ISO, kickstart files, Python (shell) scripts.
    • Outputs: Logs of the whole process (including anaconda logs), test results.
  • Approach<<Fix Me>>:
  1. Create a server and prepare the DVD ISO.
  2. Create a virtual machine on the server.
  3. Run test cases, using kickstart files to drive the installs.
  4. Send logs back to the server, along with pass/fail status.
  5. Provide "results parsers" that wait for the logs from the clients and then parse them.
  • Frameworks studied so far: kvm-autotest, autotest, libvirt, dogtail.
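The results parser in step 5 of the approach above might look like this in outline. The log format, with a per-test PASS/FAIL marker, is an assumption for illustration:

```python
import re

# Matches lines like "install-dvd-default PASS" or
# "install-ks-http FAIL (timeout)"
RESULT_RE = re.compile(r"^(?P<name>\S+)\s+(?P<status>PASS|FAIL)\b")

def parse_log(lines):
    """Extract {test_name: status} from client log lines,
    ignoring lines that carry no result marker."""
    results = {}
    for line in lines:
        m = RESULT_RE.match(line.strip())
        if m:
            results[m.group("name")] = m.group("status")
    return results
```

A watcher process would call this on each log file as clients report in, then feed the merged dict to the result matrix.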

Version 2.0

  • Graphical automation testing
  1. http://www.linux-kvm.org/page/KVM-Autotest
  2. KVM-Autotest is based on the Autotest framework: http://autotest.kernel.org/
  3. The steps to work with step files: http://www.linux-kvm.org/page/KVM-Autotest/Steps

Discussion Points

  • List unresolved or active discussion

Comments?

To leave a comment, use the Talk page for this proposal.

Owner

Fedora QA team

References

Meeting

This project holds a weekly progress meeting. Everyone is welcome to attend and offer suggestions.