User Testing

Nov 07, 2006

Publishing your content is not the last stage in the content lifecycle. For user-centered designers, in many ways it is the beginning of great content and services.

Top-notch website builders make small incremental changes in their content and interactive services and then do extensive testing to see if the "agile" design change has made an improvement. The largest sites, Amazon for example, do their incremental changes in an "A-B test." For a small fraction of site visitors, a "B" version of the page is served. Careful comparison of the "B" results may reveal an improvement, but more likely a deterioration in the usability or what is called the "user experience." Of hundreds of such changes made at Amazon per month, perhaps only one or two are good enough to be made permanent.

This is like Darwinian "survival of the fittest," with user testing the method of natural selection.
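The serving side of an A-B test like Amazon's can be sketched in a few lines. This is a minimal illustration, not Amazon's actual implementation: the function name, the visitor-ID scheme, and the 5% "B" fraction are all assumptions. The key idea is to assign each visitor to a variant deterministically, so the same person sees the same page on every visit.

```python
import hashlib

def serve_variant(visitor_id: str, b_fraction: float = 0.05) -> str:
    """Assign a visitor to the "A" (current) or "B" (experimental) page.

    Hashing the visitor ID gives a stable, roughly uniform bucket, so a
    given visitor always lands in the same variant across requests.
    (Hypothetical sketch -- names and the 5% fraction are illustrative.)
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 10000          # stable bucket in [0, 9999]
    return "B" if bucket < b_fraction * 10000 else "A"

# A small fraction of visitors get the "B" page; everyone else gets "A".
print(serve_variant("visitor-42"))
```

With the assignment stable, you can later compare conversion or task-completion rates between the two groups and decide whether the "B" change survives.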

At this year's User Interface 11 conference, leading usability expert Jared Spool told some horror stories of great commercial websites that completely made over their sites without adequate user testing in the pre-launch phase. One giant online vendor of computer systems saw a 10% drop in sales, a loss one hundred times the cost of the site redesign and many times the cost of proper user testing beforehand.

Not that professional user testing is cheap. Jared's firm consults with clients who spend hundreds of thousands of dollars in user testing because they know that a one percent improvement can add millions of dollars to their sales.

Invest in the User
So what's a smaller website designer on a minimal budget to do? I've experimented with a small suite of free or low-cost tools that let you test user reactions to your content wherever your users are in the world. I find remote user testing has a powerful effect because it shows designers and programmers the problems real users have navigating a site and finding what they are looking for. It puts the user back at the center of user-centered design (UCD).

My main tool is a free screen-sharing client/server product called Real VNC. You can use commercial products like WebEx, Microsoft Live Meeting, and Macromedia Breeze (now renamed Adobe Connect), but these can get very expensive, and they often run only in IE. Real VNC has cross-platform versions and supports the Macintosh too. I have also used eBLVD, a low-cost commercial screen-share (for IE only) that gets you 6 people on the same page for just $30/month. Professional services firms like Premiere Global resell many of these tools and can help you get the right one.

Professional user testing includes recording each testing session. This makes it reviewable later if a key designer or programmer could not attend the session. You can do this for a modest investment ($299) with Camtasia from TechSmith. It's the same tool that top usability professionals use as part of TechSmith's Morae Usability Testing product ($1298).

Screen sharing alone does not let you see the user squirming and sweating under requests to find specific content on your site. But for under $20, including shipping, you can equip your remote user with a webcam and Skype account. Presto, their wrinkled brow is visible on your screen as they just cannot find the link to "member benefits" or whatever because of "navigation blindness." I send my remote users the Ezonics iContact webcam ($13.99 plus $4 shipping from NewEgg).

I will use this modest web camera (and a screen-share program) to keynote the Framemaker Chautauqua 2006 conference in Austin, Texas, from the comfort of my lab in Cambridge, Massachusetts.

Invest More than Money
Now that you have some remote testing tools, what is the testing method? There are some great books that will tell you, like User and Task Analysis for Interface Design by JoAnn Hackos and Janice Redish and About Face 2.0 by Alan Cooper and Robert Reimann, but I like to boil it down to a list of simple scenarios.

On the CM Pros Scenarios page, we offer a list of tasks, some for members and some for site visitors who are potential members. For example, we might ask a visitor to find a book by Ann Rockley in our Resource Library, post a job listing to the Job Board, and search for member podcasts using the A-Z Site index. We might ask a member to edit their personal information in the member directory, unlock member-only content like a best practice, or retrieve a forgotten password.

We also build a list of Personas for our website design efforts, and we try to select testers who are representative of various user types. One persona might be a Macintosh user, another a blind user with a screen reader. Of course the matrix of different users (personas) testing different tasks (scenarios) can become complex. Your budget, or simply the hours you have available, will force you to keep it manageable.
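The persona-by-scenario matrix above is easy to enumerate in code; a sketch like the following (with illustrative persona and task names, not our actual test plan) shows how quickly the full matrix grows, which is why budget forces you to sample from it rather than run every cell.

```python
from itertools import product

# Illustrative personas and scenarios -- substitute your own.
personas = ["Macintosh user", "screen-reader user", "first-time visitor"]
scenarios = [
    "find a book in the Resource Library",
    "post a job listing to the Job Board",
    "retrieve a forgotten password",
]

# Full matrix: every persona attempts every scenario.
test_matrix = list(product(personas, scenarios))

print(f"{len(test_matrix)} sessions needed for full coverage")
for persona, scenario in test_matrix:
    print(f"  {persona}: {scenario}")
```

Three personas and three scenarios already mean nine sessions; add a few more of each and the count climbs fast.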

For each task, we watch the user moving the mouse around, carefully note missteps as the user clicks into the wrong menus, and record the time to completion, if they succeed! We usually learn much more from those who do not succeed. Every problem with your content that shows up in user testing is one you can fix before your real audience and customer base ever encounters it.
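The observation routine above, time each task, note missteps, record success or failure, amounts to a simple per-task log. A minimal sketch (the function and record fields are my own invention, not a standard tool) might look like:

```python
import time

def run_task(task: str, perform) -> dict:
    """Time one test task and record the observer's notes.

    `perform` is a callable that runs the session for this task and
    returns (succeeded, missteps) -- in practice, the observer fills
    these in by watching the shared screen. (Hypothetical helper.)
    """
    start = time.perf_counter()
    succeeded, missteps = perform()
    return {
        "task": task,
        "succeeded": succeeded,
        "missteps": missteps,
        "seconds": round(time.perf_counter() - start, 1),
    }

# Example record for a user who never found the link:
result = run_task(
    "find the member benefits page",
    lambda: (False, ["clicked Resources menu", "used site search"]),
)
print(result)
```

Collecting these records across sessions makes it easy to spot the tasks, and the menus, that trip up the most users.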