Case Study: Customer Interviews

Through customer interviews, ATF learned:
1. Calling is more effective than email when recruiting participants.
2. People get testing fatigue after an hour.
3. A majority of our assumptions were false!

If your application doesn’t attract your target audience, your first question should be: why? We regularly help clients answer this question, and recently we asked it of ourselves. We called up our own clients and interviewed them to uncover why business professionals do (or don’t) seek out UX design services.

We conducted our customer interviews this past October, and the insight we gained has been monumental in our current website redesign effort (launching Summer 2012), as well as our general approach to sales and marketing.

While I won’t go into depth defining a customer interview (there are plenty of resources to do that), I will talk about our process and what we learned through our research.

Step One: Assumptions, Questions and Recruiting

Before we sat down for any interviews, we had a lot of preparations to make, from compiling assumptions to gathering questions to (of course) recruiting customers!

Assumptions

Our Pre-Interview Assumptions

First we explored the assumptions (or educated guesses) we had – what we believed our clients needed, wanted, and expected from an agency.  We asked ourselves: what do clients look for when they hire a UX firm? We brainstormed via whiteboard, and after significant team discussion we were able to create a clearly defined, prioritized list of assumptions:

  1. Business Experience – ATF’s customer base puts a high priority on business experience when seeking a UX firm.
  2. User Satisfaction – ATF’s customer base associates user satisfaction with being user-centric.
  3. Frugal/On-Budget – ATF’s customer base wants to hire experts who can spend their money wisely.
  4. Artistic – ATF’s customer base wants visually appealing products & interfaces.
  5. Worry/Problem Free – ATF’s customer base wants a smooth project that is worry-free and lacking in surprises.

Questions

With the assumptions in place, we next put together a study plan to keep us focused and efficient. In the study plan, we identified the people we planned to interview, the questions we wanted answered, and our methods for collecting and analyzing the data.

ATF Ethnographic Interviews Study Plan

Questions from the Study Plan

Next we turned some questions into tasks, to allow us to observe the customer’s behavior. For example:

Original Question: What is your process for hiring a new vendor?

Question as a Task:  You have an existing application but no one who signs up is using it, and you are not getting any feedback. You need to find out why users aren’t using your application and update it to make it relevant. Your in-house design team is at capacity and they will not be available to assist you in this endeavor. How do you go about solving this problem?

Whether clients began a Google search or told stories of past hires, observing the client gave us new insight into their decision-making processes.

Recruiting

Customer interviews are also known as contextual inquiry, work study, ethnographic research, field research, and field studies. They are not to be confused with focus groups.

For this exercise we recruited both prospective and existing clients. We chose specific types of people based on our general customer base: primarily marketing or technology execs and product managers. From our experience we have compiled some “lessons learned” for the future.

  1. Phone calls are superior to email for scheduling. Prospective interviewees told us via email that they were interested, but never answered our follow-up emails to schedule a date. Phone calls are direct and efficient, and clients appreciated the personal touch.
  2. People want to help. Very few asked for any form of compensation – the real concern people had was, “How much time will this take?”

Step Two: The Interview

Between two and four ATF team members traveled together to each client’s office to conduct and observe the interviews. The minimum of two ensured there was always one facilitator (Heather) and one note taker (me).

The amount we learned over the course of the interviews cannot be summed up easily, but here are a few tips we can offer.

  1. We chose Silverback on our own laptops to record the users’ faces and their interactions with the computer. Great decision! Facial cues are indispensable.
  2. Each test consisted of an hour and a half in front of the computer plus an additional 30-minute card-sorting exercise. As it turns out, two hours is too long: our users got testing fatigue around the 90-minute mark. For future tests, we’ll keep each session under an hour and break the different research activities into separate sessions.
  3. Watch for actions. Don’t listen for opinions. We heard plenty of opinions and critiques, but many people act contrary to what they say. In the video below, for instance, upon first glance at a home page the user states, “I wouldn’t contact these guys” – but instead of leaving, he continued to interact with the site, and changed his mind! We can’t emphasize it enough: When conducting user research, watch what users do, not what they say.

Ethnographic Interview Clip – Michael Bourque from Joe Baz on Vimeo.

Step Three: The Findings

As much as we learned during testing, we learned even more when we reviewed the findings. Our goal was to convert our assumptions into facts, or disprove them entirely, and we succeeded. Here’s a subset of what we discovered:

Assumption: Clients are interested in seeing a well-established agency.
True or False: False
Finding: Clients were repeatedly turned off by the “big” guys. Laundry lists of portfolio items, huge teams, and particularly teams with lots of middle managers were a no-go.

Assumption: Clients are looking to understand the agency’s process to see if it fits with theirs.
True or False: True & False
Finding: When evaluating an agency website, clients look for a “result”: they want best practices and a fresh eye on their product. When evaluating an agency proposal, they are interested in seeing the agency’s process.

Assumption: Clients are looking to get a lot of information from an agency’s website.
True or False: False
Finding: Clients reacted more favorably to sites with very little text and more visuals or infographics. Sites that tried to explain every detail of the business were poorly received.

Thanks to these findings, we are now improving not only our website communications, but also our proposals, pitches, and general correspondence with prospective clients.

Takeaways

Customer interviews are valuable not just for the information garnered in the interview, but for the entire process. Through customer interviews you will learn more about your product and yourselves.

  • Creating assumptions will bring your team together, and help everyone to compare their understanding of the end user.
  • Turning questions into tasks will provide insight into the user’s daily work. The more tasks your team creates, the more questions will appear about how the user makes decisions or uses your product.
  • Recruiting is a lesson in communication. Learning what your users value and speaking to them in person will help you to speak their language when updating your application.
  • Interviews teach everyone involved. Every observer will come away with takeaways and lessons learned, and remember: watch what users do, not what they say!
  • Compare your findings to your assumptions and create a record to use in the future. The best part of conducting user interviews is learning from them, to make future applications, products, and websites ever better.