Using cog walks to supercharge your UX research
Last week we held a UX research session on Project Ara, the modular smartphone coming from Google ATAP (Advanced Technology and Projects). I asked Dhvani Patel, Senior UX Researcher at Google X, for help with usability research, and she suggested conducting a Cognitive Walkthrough (or “cog walk” for short) with four senior members of the Google X design team.
This was my first time taking part in a cog walk, and I was impressed by how helpful it was — it was like UX research on steroids.
The cognitive walkthrough is a usability inspection method in which a group of experts steps through tasks in a system and identifies problems along the way. It’s like a one-on-one usability session, but instead of recruiting potential users, you use UX designers and researchers (who are new to the project) as your guinea pigs. And instead of conducting a series of time-consuming one-at-a-time sessions, you get feedback from the whole group all at once. On the other end of the spectrum, the cog walk is also similar to a design critique, but here the reviewers evaluate the work from a user-task point of view rather than stepping back to look at the whole system. And unlike in a typical review, designers aren’t allowed to defend or explain their decisions. They have to sit quietly and accept the feedback as it comes.
We used Google Slides as the tool for our walkthrough. In advance of the in-person session, I built out a slide deck. It included some ground rules for the cog walk and context on Project Ara, then three “golden path” user flows around the Ara experience. Slides was a helpful tool because I could share the work in progress with the product manager and researcher, refining the presentation for the meeting based on their comments. For the first two user flows, I stepped through high-fidelity mocks in the chrome of the mobile device. For the third flow, which included more of the physical interaction with the Ara phone, I used a combination of hand-drawn storyboard sketches and UI wireframes superimposed into the drawings.
Ravi Shah, the PM, and I met the participants for the first time as they entered the room, and we didn’t tell them our roles on the project. Dhvani coached us that participants sometimes hold back on critical feedback if they know the designer or product manager is in the room, so we kept the introductions minimal.
At each step of the flow, the group was asked:
- Is that what you expected?
- What would you do next?
- How did you know to do that?
The participants were exceptionally blunt and straightforward in their feedback, which was equal parts refreshing and hard to hear. We were getting first-time reactions to our flows, and I was surprised by how much they expressed emotional responses — feeling overwhelmed, or confused, or inadequate, or annoyed by the experiences in the slides.
One great thing about having a group of designers in the room is that they are familiar with the process of product design and development, and they know how to read wireframes. They’re not distracted by the fidelity of the mocks they’re looking at. Unlike recruited users in one-on-one tests, they are not shy or afraid to speak up. You can trust that what they’re saying is genuinely meant to help improve the product (as they’re not collecting a check).
We had someone taking notes live, in the slides themselves, during the session. Afterwards, the researchers went through the comments and added a summary of the feedback to the deck. In that way, the slide deck becomes both a record of the session and a working document with advice for improving the product.
I was impressed by how efficiently the cog walk can deliver a ton of actionable feedback in a concentrated amount of time. It was a sanity check. Ravi remarked: “It was a really good way to get honest feedback on the experience without everyone commenting on the design elements themselves.”
Cog walks are best for “new user” experiences; they’re not as effective for evaluating other types of flows, say advanced tools that require a lot of context. Also, the success of the session depends on the group and the expertise of your participants. In that respect, we were very lucky. We received, in one short hour, a wealth of feedback that could have taken a good full day of one-on-one sessions.