Michael DeVries

Sr. User Experience Researcher and Analyst

Usability Testing in the Lab: Multiple prototypes of a Redesigned SprintPCS.com Home Page

Overview

The goal of the study was to usability-test several designs of the SprintPCS.com home page, assessing both objective and subjective measures. A comparative evaluation was performed on four web site designs, plus a slight variation of the fourth design. Prototypes A, B, C, and D below are not shown in any particular order, and D2 is not pictured.

Four Sprint.com Home Page Designs




Research Design

  • Conduct Usability Study (40 participants)
  • Data collection in 3 cities: DC, Dallas, San Diego
  • Measures included:
    • Can the user match a task to the target location? (between subjects)
    • Subjective rating of usability completed after each task (between subjects)
    • Forced-choice preference ranking of all prototypes (within subjects and collected last)
  • Two focus groups (16 users), conducted in DC and Dallas by an external research consultant
  • Heuristic analysis: the prototypes were evaluated by 3 expert evaluators against Nielsen's home page design guidelines (113 guidelines in total)
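The forced-choice preference rankings above are within-subjects data, and a common way to test whether such rankings differ significantly across prototypes is a Friedman test. The sketch below is purely illustrative: the function name and the ranking matrix are mine, and the numbers are made up for demonstration, not taken from the study.

```python
# Illustrative sketch: a Friedman chi-square statistic computed over
# within-subject preference rankings. The data below is invented for
# demonstration and is NOT the study's actual data.

def friedman_chi_square(ranks):
    """Friedman chi-square statistic for a table of within-subject ranks.

    ranks: one list per participant, containing that participant's rank
    for each prototype (1 = most preferred). Assumes no tied ranks.
    """
    n = len(ranks)       # number of participants
    k = len(ranks[0])    # number of prototypes each participant ranked
    # Sum of ranks received by each prototype across participants.
    column_sums = [sum(row[j] for row in ranks) for j in range(k)]
    # Standard Friedman statistic: 12/(n*k*(k+1)) * sum(Rj^2) - 3*n*(k+1)
    return (12.0 / (n * k * (k + 1))) * sum(r * r for r in column_sums) \
        - 3.0 * n * (k + 1)

# Three hypothetical participants ranking three prototypes identically:
consistent = [[1, 2, 3], [1, 2, 3], [1, 2, 3]]
print(friedman_chi_square(consistent))  # -> 6.0 (maximal agreement)

# Two participants with opposite rankings:
split = [[1, 2, 3], [3, 2, 1]]
print(friedman_chi_square(split))       # -> 0.0 (no agreement)
```

The statistic is compared against a chi-square distribution with k−1 degrees of freedom; in practice a library routine (e.g. `scipy.stats.friedmanchisquare`) would also return the p-value.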

Accomplishments

  • The findings were weighted, in order of importance: performance (task pass/fail), heuristic evaluation (consensus review by three experts), and then preference. Preference measures included site rankings, ratings of each site's aesthetic appeal, and a "sticker" exercise not described here.
Home Page Study Results
The key finding was that performance was effectively equal across the designs. The heuristic review was therefore used to drop web site A from the final competition, while other elements of that site were still allowed to influence the final UI design.

Building on That

Usability practitioners understand well that how a user performs on a task can be very different from how they feel about it. Because the Marketing group was a key stakeholder in this project, I brought them the results in a presentation on the difference between performance and preference. On several pages, I placed different colored pictures of Snoopy (the Peanuts character by Charles M. Schulz), clearly visible amongst the data and visuals. On one page, I intentionally placed a very small black-and-white Snoopy. As the presentation went on (large-screen projection), the first person to call out "Snoopy!" on a page was given a candy prize. When we reached the page with the small, unobtrusive Snoopy, the room went quiet. Just before I was about to change the slide, one individual finally exclaimed "Snoopy!" and was given the prize, to the astonishment of everyone else in the room.

At the end of the presentation, I provided them with the chart seen below. Given a large group, the expectation is that preference rankings will be fairly evenly distributed, while in the performance column far fewer people will have seen the very small, well-hidden black-and-white Snoopy. The closing line went something like, "So while you look at a user interface and judge it quickly based on whether you like its aesthetics, how well a customer performs at a site may be quite a different story."

I've added simulated data to the table below. Feel free to use it for your own presentation where this might come in handy. Please be sure to include the credit for Mr. Schulz.

How many people like it:          How many people saw it:
(Preference gives a rank order)   (Performance gives task pass/fail)

         5                                  25
         6                                  27
         4                                   3
         5                                  27
         7                                  27
Snoopy images belong to Charles M. Schulz
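The contrast in the simulated table can be made concrete with a few lines of arithmetic. The sketch below uses the simulated counts above (taking 27 as the number of attendees, since the "saw it" column tops out at 27); the variable names are mine.

```python
# Sketch of the preference-vs-performance contrast using the simulated
# Snoopy data from the table above (counts out of 27 attendees).

liked = [5, 6, 4, 5, 7]      # "How many people like it" per image
saw = [25, 27, 3, 27, 27]    # "How many people saw it" per image
n = 27                       # assumed audience size (max of the "saw" column)

# Preference spreads fairly evenly across the images...
preference_share = [round(x / sum(liked), 2) for x in liked]
# ...while detection (performance) is near-perfect except for one image.
detection_rate = [round(x / n, 2) for x in saw]

hidden = detection_rate.index(min(detection_rate))  # the small B/W Snoopy
print(preference_share)
print(detection_rate)
print(f"Image {hidden + 1} was seen by only {saw[hidden]} of {n} people")
```

The point the presentation made falls straight out of the numbers: no image's preference share is far from the others, but one image's detection rate collapses, because it was designed to be missed.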


And Did You Experience the Same?

The UI design of this page may have had the same effect on you. Note the table below.

Which do you like best? Did you honestly see them all?

Which of these do you know you remember seeing? The same tactic as in the Snoopy presentation was used on this page. Which of the items do you prefer? While you may not have preferred the black one, you may not have known it was on the page at all. If you missed it, that is because it was designed to be missed. So preference is not performance, and you have just experienced the difference.


Skills

  • Experimental design
  • Vendor management
  • Questionnaire design
  • Screener design
  • Heuristic Evaluation
  • Integration of multiple data sources
  • Usability testing
  • Helping others to understand usability


Copyright 2010 Michael DeVries