The purpose of AutoQA install test automation is to simplify testing, reduce the test execution time and improve efficiency. The AutoQA install test project should address the following problems:
- The system should be easy for both testers and developers to use
- Have clear documentation for customizing and creating new tests
- Support test execution using existing Fedora infrastructure services, but not require them
- Test results are easy to verify
The Fedora installer is a complicated software application that often requires significant setup time to properly and efficiently test. Installer failures typically come from the following areas:
- ISO file size is too large (or small)
- Invalid SHA256 checksum
- Invalid implanted ISO md5sum
- Install environment - anaconda has specific application, library and config file format requirements.
- Version check failures
- Boot media improperly built (PXE, boot.iso, CD/DVD, efidisk.img)
- Installer fails to boot as a KVM guest
- Installer fails to boot as a XEN guest
- Unable to detect install.img media
- Unable to transition to stage#2 installer
- Ks.cfg could not be obtained from specified location (http, ftp, nfs, hd, initrd)
- Install fails to proceed in accordance with the directives in the ks.cfg file
- Install improperly sets up networking based on command-line and kickstart network parameters
- X driver problems while transitioning to graphical install
- Screen corruption during text-mode install
- VNC fails to start
- Serial console redirection improperly setup
- Fail to detect existing storage device(s)
- Failure to clear stale data off of existing devices
- Unable to add iSCSI volumes
- Failure detecting existing partition scheme (lvm, mdraid, dmraid, luks)
- Failure when attempting to resize existing partitions
- Failures while attempting to re-use existing partitions
- Improperly clearing stale information from disks
- Unable to consistently resize an existing filesystem
- General failures while attempting to manually partition a system
- Unable to read metadata from package repositories (http, ftp, nfs, media)
- Failures while adding or modifying existing package repositories
- Network timeout while retrieving packages
- Dependency problems while resolving package list
- File conflicts during package install
- Package ordering and install errors
- Improperly formatted boot loader configuration
- Unable to properly detect other operating systems
- Failure while setting up chainloader for another OS
- Failure to detect previously installed systems
- Errors while attempting to update bootloader configuration during upgrade
- Package upgrade errors
- Rescue mode fails to detect existing installations (lvm, raid, luks)
- Rescue mode fails to establish networking
- Problems saving traceback information (local disk, bugzilla, remote server)
- Anaconda unable to download and use updates.img (from install source, local media or URL)
- Unable to transition to debug mode
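Several of the media-level failures above (ISO file size, SHA256 checksum) are cheap to verify before any install test runs. A minimal sketch of such a pre-flight check follows; the function name, size bound, and return format are illustrative, and the implanted md5sum would be checked separately with the `checkisomd5` tool from the isomd5sum package:

```python
import hashlib
import os

def verify_iso(path, expected_sha256, max_size=4_700_000_000):
    """Run basic media checks on an ISO: size bounds and SHA-256.

    The implanted md5sum is not checked here; that is normally done
    with the external `checkisomd5` tool.
    """
    results = {}
    size = os.path.getsize(path)
    # A zero-byte or oversized image indicates a broken compose.
    results["size_ok"] = 0 < size <= max_size
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks to keep memory use flat for DVD images.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    results["sha256_ok"] = h.hexdigest() == expected_sha256.lower()
    return results
```

A test script would run this first and skip the install stages entirely when either check fails, reporting the media itself as broken.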
First, in order to provide a consistent and documented test approach, the existing Fedora Install test plan will be revisited. The test plan will be adjusted to ensure proper test coverage for the failure scenarios listed above. Existing test cases will be reviewed for accuracy. New test cases will be created using the Template:QA/Test_Case template. Finally, the test plan will be adjusted to match the improved Fedora Release Criteria. This includes adjusting the test case priority to match milestone criteria.
Next, in order to reduce the setup/execution time, improve efficiency and to provide test results on a more consistent basis, a subset of test cases will be chosen for automation. Tests will be written in python and will be developed and executed on a system supporting KVM virtualization. Test scripts will be responsible for preparing a virtual install environment, initiating a kickstart install and validating the results. Once an initial batch of tests exist, they will be formally integrated into the AutoQA project.
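One way a test script could prepare and launch a kickstart install is by driving `virt-install` on the KVM host. The sketch below only builds the command line rather than executing it; the guest name, paths, and extra kernel arguments are assumptions, and a real test would also attach a serial console and collect logs:

```python
import os

def build_virt_install_cmd(name, iso, ks_path, memory_mb=1024, disk_gb=8):
    """Build a virt-install command line for an unattended kickstart
    install in a KVM guest.  Option values here are illustrative."""
    return [
        "virt-install",
        "--name", name,
        "--memory", str(memory_mb),
        "--disk", f"size={disk_gb}",
        "--location", iso,
        # Inject the kickstart into the initrd so no network is needed
        # to fetch it.
        "--initrd-inject", ks_path,
        "--extra-args",
        f"ks=file:/{os.path.basename(ks_path)} console=ttyS0",
        "--graphics", "none",
        # Stop after the install finishes so the script can inspect
        # the guest disk and logs before first boot.
        "--noreboot",
    ]
```

The test harness would pass this list to `subprocess`, wait for the guest to halt, and then validate the resulting disk image and logs.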
Last, a method will be developed for collecting test results into a single test result matrix. Results may be posted to the wiki directly, or a custom turbogears application may be needed to display results. The results will be easily accessible for testers and the installer development team.
- Any tester from the community can download and install the automation framework and run the test cases
- After testing finishes, testers will post their results to the specified wiki page
- Revise Fedora Install test plan to ensure test coverage exists for failure scenarios listed above
- Create scripts to initiate install tests:
- Use a virtual machine (libvirt + KVM) to run the install tests one by one.
- Provide a simple interface for customizing testing.
- Create scripts to test anaconda automatically
- Create test cases for the possible anaconda failures listed above (DVD, CD-ROM, boot CD, live CD)
- Create a test result display page that makes results easy to check; the result format should be ready for testers to post to the specified link without significant rework.
- Write test cases from simple to complex, from kickstart-driven to graphical installs.
- AutoQA must be packaged and available in Fedora infrastructure
- A valid backend test harness is packaged and available in Fedora infrastructure (Autotest currently being used by AutoQA)
- Knowledge of TurboGears or TurboGears2
- A new AutoQA test watcher may be required to initiate automated tests whenever ISO images are available for testing (post-iso-build)
- Provide links to TRAC roadmaps used to monitor progress
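The test watcher mentioned above could begin as a simple poller that notices new ISO images and hands them to the test harness. This is only a sketch under assumed conditions (a local directory of composes, polling rather than a real post-iso-build hook), not the actual AutoQA watcher interface:

```python
import os
import time

def watch_for_isos(directory, seen=None, poll_seconds=60, max_polls=None):
    """Poll a directory for newly appearing .iso files and yield their
    paths.  The real AutoQA watcher would instead hook into Fedora's
    post-iso-build event; this polling loop is illustrative."""
    seen = set() if seen is None else seen
    polls = 0
    while max_polls is None or polls < max_polls:
        for entry in sorted(os.listdir(directory)):
            if entry.endswith(".iso") and entry not in seen:
                seen.add(entry)
                yield os.path.join(directory, entry)
        polls += 1
        if max_polls is None or polls < max_polls:
            time.sleep(poll_seconds)
```

Each yielded path would then be queued as a new automated install test run.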
- Goal: Create a basic system that is simple and runs successfully.
- Case: DVD input, supporting the simplest test cases.
- Platform: Virtual Machine.
- Inputs and Outputs:
- Inputs: DVD, kickstart files, python (shell) scripts.
- Outputs: Logs of the whole process (including anaconda logs) and test results.
- Approach:
- Set up a host server and prepare the DVD ISO.
- Create a virtual machine on that server.
- Run test cases, using kickstart files to drive the installs.
- Send back the logs, whether the install succeeds or fails.
- Run "results parsers" that wait for the logs from the clients and then parse them.
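The "results parser" step above could start as a small classifier over the returned log text. The marker strings in this sketch are illustrative placeholders, not verified anaconda output:

```python
import re

def parse_install_log(log_text):
    """Classify one install attempt from its log text.

    The marker strings below are assumptions for illustration; a real
    parser would match actual anaconda log messages."""
    # Check for failure markers first so a log containing both a
    # traceback and a completion message is still reported as FAIL.
    if re.search(r"install exited abnormally|Traceback", log_text):
        return "FAIL"
    if re.search(r"install completed|shutting down", log_text, re.I):
        return "PASS"
    # No terminal marker: the guest probably hung or lost its console.
    return "INCOMPLETE"
```

Returning a third "INCOMPLETE" state matters here: a hung install is a different failure mode from a crashed one, and the result matrix should distinguish them.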
- Frameworks studied so far: kvm-autotest, autotest, libvirt, dogtail.
- Graphical automation testing
- kvm-autotest is based on the autotest framework: http://autotest.kernel.org/
- The steps to work with step files: http://www.linux-kvm.org/page/KVM-Autotest/Steps
To leave a comment, use the Talk page for this proposal.
Fedora QA team
A project meeting is held weekly to track progress. Everyone is welcome to attend and offer suggestions.
- IRC on #fedora-meeting
- Each Friday at 8:30 UTC (9:30 CET and 16:30 Beijing)