From Fedora Project Wiki

{{admon/note|Work in Progress|These are Leah's notes on Usability Testing. This is a work in progress and suggestions are welcome. Please use the ''discussion'' link above.}}


== Why Usability Testing? ==
''Compelling intro paragraph here.''
 
== Usability Testing Terms ==
* '''Participant''' - the person taking the usability test
* '''Tester''' - the person running the usability test
* '''Thinking Aloud''' - the participant talks through what s/he is doing for the benefit of the recording
* '''Active Intervention'''  - the tester asks questions, such as "What would you do next?" and "What do you think about the menu structure here?"
* '''Remote Testing''' - usability testing done at a distance, where the tester and the participant are not in the same room, but communicate via some audio device
* '''Test Scenario''' - a written script of the task the participant is to perform during the usability test; it should encompass the intended use of the application
 
== What you need to do Usability Testing ==
* A '''tester''' (to run the test)
** Two Copies of the ''Consent Form'' (one for tester, one for participant)
*** ''this might be covered by the CLA - check with legal?''
** Signup Form
** A written Test Scenario
** Paper/laptop for Notes
* A '''computer''' (with the application/website you're testing)
* A '''participant'''
* A '''videocamera''' or '''screen/audio capture program'''
* A '''room where you won't be interrupted during the test'''
== Creating a Usability Test ==
=== Basic Order ===
# '''Choose Application/Website:''' Choose an application/website you want to test
# '''Intended Use:''' Determine the intended use of this application/website
# '''Intended Audience:''' Determine the intended audience for the application (general user, power user, sys admin, other)
# '''Create a Test Scenario:''' Create a test scenario for the participant to follow
# '''Recruit Participants:''' Five is a good number.  You can strong-arm friends and family, recruit online, or just get some co-workers on a slow afternoon
# '''Run the Usability Test:''' Yes, do that.
EXAMPLE:
# Application: Software Updater
# Intended Use: Updating the system, which involves downloading and installing new packages
# Intended Audience: This applies to all users, so general users are the right choice of participant
# Test Scenario: Have the participant log in to a system (for the first time); there will be a notification about updates. Have the participant then install the updates
# Recruit Participants: Grab some people at your office, or get some people to volunteer online
# Run the Usability Test: Run the tests, and watch the videos with your colleagues


== Documents to Create ==
* Waiver for participants - an outline for a waiver
* Guidelines for participants (person performing the test)
* Example test results (a video, some "conclusions" from the test video)
** http://betterdesktop.org/ has plenty of example data.


== Things to test ==
Ideal things to test:
* File Browser
* Software Updater

Authors/maintainers of applications may sign up to volunteer their application to be tested - they will need to include this information on the application:
* Intended use
* Intended Audience (general users, power users, sys admins)
* (Optional) can also propose test scenarios


''Avoid testing things that are not going to change or we are not capable of changing.''


== About participants ==

=== Types of Participants ===
In order to consolidate our efforts, each participant will self-apply one of these terms in order for testers to correctly test the intended audience for an application.
* '''General User:''' Daily to semi-daily computer user, may dabble in writing code
* '''Power User:''' Constant computer user, has written code
* '''Sysadmin:''' A Systems Administrator

=== Participant Signup Form Info ===
* What kind of overall computer user are you? (general user, power user, sysadmin)
* Primary OS use (Mac, Linux, KDE, Windows, etc.)
* Age
* Gender
* Experience with Linux (first-time user, casual, experienced)
* Experience with Fedora (first-time user, casual, experienced, dev team)

If not testing at FUDCon, then get participation volunteers, including this information.

It's easier if there is a paper form, as well as an online form for participant signups.
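If the online signup form gets built, its backend can check submissions against the categories above. Here is a minimal sketch in Python - the field names and accepted values are this example's assumptions, not an existing Fedora form:

```python
# Sketch of a validator for participant signup records.
# Field names and accepted values are assumptions for this example.

USER_TYPES = {"general user", "power user", "sysadmin"}
FEDORA_EXPERIENCE = {"first-time user", "casual", "experienced", "dev team"}

def validate_signup(record):
    """Return a list of problems with a signup record (empty list = valid)."""
    problems = []
    if record.get("user_type") not in USER_TYPES:
        problems.append("user_type must be one of: " + ", ".join(sorted(USER_TYPES)))
    if record.get("fedora_experience") not in FEDORA_EXPERIENCE:
        problems.append("unrecognized fedora_experience")
    if not record.get("primary_os"):
        problems.append("primary_os is required")
    return problems

signup = {
    "user_type": "general user",
    "primary_os": "Linux",
    "age": 28,
    "gender": "F",
    "linux_experience": "casual",
    "fedora_experience": "first-time user",
}
print(validate_signup(signup))  # → [] (a valid record)
```

Keeping the categories in one place like this means the paper form and the online form stay in sync with the participant types the testers expect.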


== Writing a Test Scenario ==
Write a scenario for the application that tests its intended use.  For example, let's say we're testing the File Browser.  The tester would tell the participant to perform a task that would take them through the File Browser.

Estimate the amount of time for the task - this is important in order to give the participant some idea of how long they'll be testing.  Also, if the task is very short, testing it can be combined with other short tasks, in order to get good mileage out of one participant.

ALSO NOTE: After watching your first usability test with this scenario, you may need to adjust your time estimate for the completion of the tasks.  This is normal.

Example scenario for the File Browser: On this computer, there is a folder titled "Cute Animals" in the user's Pictures folder.  Go to this folder, and pick your favorite cute animal picture, then move it to the desktop.

=== Notes about scenario tasks ===
Generally, a user will be more involved in a task if they get a choice or options to choose from.  Of course, for some tasks, this isn't possible, but it's fun to try.
== Recording the Usability Testing ==
Most usability tests need to be recorded so the tester can show the test to other designers, or review the test later to make sure their information is sound.  What you want to record during a test is the screen/monitor and the participant and tester's voices.
=== Different Methods of Recording a Usability Test ===
* Videocamera
* Screen Capture & Audio (e.g. with {{package|istanbul}})
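If {{package|istanbul}} is not available, ffmpeg's X11 grabber can capture the screen and the microphone in one pass. A rough sketch - the display (:0.0), capture size, frame rate, and PulseAudio source are assumptions to adjust for the test machine; the script only prints the command so it can be reviewed before running:

```shell
# Sketch: build an ffmpeg command that records the screen plus microphone.
# Display, size, framerate, and audio source are assumptions for this example.
OUTFILE="usability-test-$(date +%Y%m%d-%H%M).mkv"
CMD="ffmpeg -f x11grab -video_size 1024x768 -framerate 15 -i :0.0 \
  -f pulse -i default -c:v libx264 -preset ultrafast -c:a aac $OUTFILE"
# Printed for review; run the command to start recording, press q to stop.
echo "$CMD"
```

A videocamera pointed at the screen avoids software setup entirely, but software capture gives much clearer video of the screen and keeps the participant's voice on the same recording.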


== What to do as a Tester ==

=== During a test ===
* Make sure the participant signs the consent form
* Reassure the participant that this is in NO WAY a test of their skills or knowledge; it is a test of the application itself.
** This should also be the mindset of the persons evaluating the usability tests.
* Encourage the participant to "think aloud" throughout the test.
** Sometimes this can be a bit hard, as it's not something people are used to doing.
** This means encouraging them to tell you:
*** What they're planning to do
*** Why they make the decisions they make
*** What they expect to happen when they make a choice
* You ''CAN NOT'' point out where things are or what is happening to the participant - these things should be made clear by the application, and if they aren't, then confusion will be the result of the lone user experience (which is what we're attempting to emulate).
* If a participant is not actually "thinking aloud" very much, ask questions that help get a peek into what they're thinking - for example, if a user hits a button and then says "oh!" or "What?" then ask them what they expected to happen versus what happened.  A tester should note a participant's confusion or surprise and the reasons for it.
* Avoid asking the participant leading questions or giving them any indication that they are way off course.  A leading question can skew the participant's actions.  Try to be objective and ask things like "Which of these would you click/select and why?"
* There are times when things have gone so far afield that you may have to start over.  Again, this is not the participant's fault, or yours, or even necessarily the application's.  After a usability test has gone awry, usually you can make adjustments to the test scenario to compensate for some of the problem.

=== After a test ===
* Thank the participant
* Summarize the participant's experience in notes

''Latest revision as of 04:09, 8 January 2009''