Test analyzer/report writer checklist.
Pre-test.
- Know and understand the objectives of the test.
- Know and understand the measures of the test.
Post-test.
- Compile.
- Combine the facilitator's, coordinator's, and data logger/note taker's information into the same form. A spreadsheet may be useful.
- Separate performance data and preference data.
- Organize preference data by aspect type (e.g. frustration, confusion, satisfaction).
- Group specific preference data by task and then by relevant web page, to allow comparison of different participants on the same part of the web site (one way to do this grouping is sketched after this list).
- Group questions and answers together with the corresponding data category.
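Below is a minimal sketch of one way to hold the combined notes in a single spreadsheet-style form, separate performance from preference data, and group preference data by task and page. The field names, tasks, and comments are illustrative assumptions only, not a prescribed format.

```python
from collections import defaultdict

# Assumed record layout: one row per logged observation, combined from the
# facilitator's, coordinator's, and data logger's notes (illustrative only).
observations = [
    {"participant": "P1", "task": "Find store hours", "page": "/contact",
     "type": "preference", "aspect": "frustration",
     "comment": "Couldn't find the hours anywhere on this page."},
    {"participant": "P2", "task": "Find store hours", "page": "/contact",
     "type": "preference", "aspect": "confusion",
     "comment": "Expected the hours under 'About us'."},
    {"participant": "P1", "task": "Find store hours", "page": "/contact",
     "type": "performance", "aspect": "",
     "comment": "Completed in 95 seconds with 2 wrong links."},
]

# Separate performance data from preference data.
performance = [o for o in observations if o["type"] == "performance"]
preference = [o for o in observations if o["type"] == "preference"]

# Group preference data by task, then by page, so different participants'
# reactions to the same page sit side by side.
grouped = defaultdict(lambda: defaultdict(list))
for o in preference:
    grouped[o["task"]][o["page"]].append(
        (o["participant"], o["aspect"], o["comment"]))

for task, pages in grouped.items():
    print(task)
    for page, notes in pages.items():
        print(" ", page)
        for participant, aspect, comment in notes:
            print("   ", participant, aspect, "-", comment)
```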
- Analyze.
- Calculate the user profile frequency distribution (one way to compute it is sketched after this list).
See an example user profile frequency distribution.
- Compare the test measures to the test outcomes.
See an example task quantitative criteria form.
See an example performance data summary form.
- Determine why tasks caused problems and rank them. (Error source analysis)
- Identify the tasks that produced poor results against the criteria for successful task completion.
- Wherever possible, attribute the source of each error/problem in those tasks to the responsible component of the web site, or combination of components. It is essential here to be absolutely sure about why the problem arose.
- Review the videos and transcripts where necessary.
- Transfer the focus from the task to the web site.
- Use care in defining metrics for ranking.
- Devise criteria for a severity rating on a 0-4 scale. (See the findings classification under Report below.)
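A minimal sketch of calculating the user profile frequency distribution, assuming each participant's screening answers are stored as a dictionary; the characteristics and categories shown are illustrative assumptions, not a required set.

```python
from collections import Counter

# Illustrative participant profiles taken from the screening questionnaire.
participants = [
    {"web_experience": "daily", "age_group": "18-29", "role": "student"},
    {"web_experience": "weekly", "age_group": "30-49", "role": "staff"},
    {"web_experience": "daily", "age_group": "30-49", "role": "staff"},
    {"web_experience": "daily", "age_group": "50+", "role": "faculty"},
]

# For each profile characteristic, count how many participants fall into
# each category - this is the user profile frequency distribution.
for characteristic in ("web_experience", "age_group", "role"):
    distribution = Counter(p[characteristic] for p in participants)
    print(characteristic)
    for category, count in distribution.most_common():
        print(f"  {category}: {count} of {len(participants)} participants")
```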
- Recommend.
- Work with the team to come up with useful recommendations for changes to the web site based on the cause of the errors.
- Take a reasonable amount of time to compile the results. Be sure to use plenty of actual quotes and real numbers ("4 out of 7 participants did..."); statements like "most", "few", or "some" make it harder for readers to be persuaded by the results.
- Keep test objectives in mind.
- Consider global changes before local ones. (Don't lose sight of the forest for the trees.)
- Remain objective; your only concern is the user.
- Report.
- Write the final report so that it is useful for decision makers responsible for the web site (CWC).
- The aim is to initiate change, direct action, educate and document. (In that order.)
- Include key quotations.
- Include mean and range values for Likert scale ratings on the questionnaires (see the sketch at the end of this checklist).
- Include a summary of post-test comments.
- There is no industry-standard report format. Generally the report will take the following form:
- Executive summary.
- Methodology.
- User profile.
- Evaluation measures.
- Findings: Each finding is classified as one of the following:
- A catastrophic problem. Imperative to fix. = 4 on a 0-4 severity scale.
- A serious problem. Important to fix. = 3 on a 0-4 severity scale.
- A minor problem. Fixing this should be given a low priority. = 2 on a 0-4 severity scale.
- A suggestion for improvement. Cosmetic problem only; fix when time permits. = 1 on a 0-4 severity scale.
- No problem. = 0 on a 0-4 severity scale.
- A positive finding.
- Appendix: Task list.
- Appendix: Interview questions.
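A minimal sketch of computing the mean and range values for Likert scale ratings, as called for under Report above; the questions and ratings shown are illustrative assumptions only.

```python
# Illustrative post-test questionnaire ratings on a 1-5 Likert scale,
# one list of participant ratings per question.
likert_ratings = {
    "The site was easy to navigate": [4, 5, 3, 4, 2, 4, 5],
    "I could find what I was looking for": [2, 3, 2, 4, 3, 2, 3],
}

# Report the mean and the range (lowest-highest rating) for each question.
for question, ratings in likert_ratings.items():
    mean = sum(ratings) / len(ratings)
    low, high = min(ratings), max(ratings)
    print(f"{question}: mean {mean:.1f}, range {low}-{high} (n={len(ratings)})")
```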