The idea is for the students to compare their client’s product to its competitors, and in the process discover what works (and doesn’t work) with their website. They’ll then couple this with the observations they made during the previous week’s field trips to arrive at concrete recommendations.
When I put together this week’s lecture, I realized that I couldn’t pinpoint exactly where I learned how to do usability testing. Certainly it was discussed in my schooling (at least the principle of it), but I don’t recall any hands-on practice. Most likely, I’ve sponged bits and pieces over the years from talented colleagues who hail from Human-Computer Interaction backgrounds. And over time a lot of it has become second nature to me. When I stopped to dissect it all, I realized it’s actually a very complex process, in some cases probably worthy of having someone oversee all the details!
As I reviewed the students’ test plans and observed (and in some cases participated in) their dry runs, I realized how much of it is a nimble balance of hopping between the scripted and the improvised. And, of course, there’s the very careful way you must proceed with phrasing and questions to keep everything as neutral, open-ended, and natural as possible. Many students commented on how much harder it was than they expected to nail the tasks, the timing, and especially the phrasing of the conversations.
Here’s an excerpt from the usability testing tips that I compiled for the class:
In a strange coincidence, I was cleaning out some old files that were lurking about on my server and came across one of my “UX resources” files from 10+ years ago, which had a usability PDF I must have found on the Information & Design site long ago (thanks, guys, for having such helpful resources up even back then!). Some of the tips I’ve included date back to this document.