I never watched the television series MacGyver, but I understand why it lasted 12 seasons across its 1980s and 2010s vintages. Its ingenious star could transform household objects – think duct tape, newspaper, alarm clocks – into contraptions that could help him escape from harm or inflict damage on the bad guys.
Having worked in tech marketing for over 20 years, including nearly six years for startups, I’ve had my fair share of projects that required making do with limited resources. That includes gathering and analyzing customer insights to inform product roadmap decisions and go-to-market strategy and execution when I couldn’t afford quantitative research or was denied access to existing customers.
Several years ago, my company decided to launch an extension of our flagship product that would cater to small-to-midsize businesses (SMBs). My CPO (Chief Product Officer) and I recommended talking to prospective customers so we could determine what to build and how to market it. We didn’t have the time or budget to conduct quantitative surveys, so we decided to create an interview guide that we could use to generate qualitative, directional feedback.
There was a problem with this plan, however. Our company didn’t have a single SMB customer. Over the years, I’ve learned how to persuade sellers, account managers, and customer success teams to let me contact existing users, but that skill was of no use on this project.
Instead, the CPO and I had to MacGyver the situation.
Finding and contacting the right people
Again, my firm didn’t have any of the customers we needed to engage. Our investors, however, had portfolio companies filled with exactly the right people. The first thing we did, therefore, was draft an email and talking points for our CEO so he could ask our VCs to introduce us to those companies and to other investors they knew.
It has been a few years since I have reflected on the project, but here’s what I remember about the results:
- VC firms contacted: 20-30
- Warm introductions to customers: 40-50
- Interviews held: Over 30
- Average interview length: About 30 minutes
- Total time required to complete the project: About five weeks
Constructing the interview guide
As soon as we started scheduling time with these prospective users, I built an interview guide. Not to be confused with a script, the guide helped structure the conversations we would have. We knew going in that we wouldn’t have time to ask every question on it. Depending on how the discussions went, though, we felt confident that the guide would help us adapt on the fly and get the most out of the precious time we had.
We designed the interview guide to start every conversation with an icebreaker. We knew that in order to get the quality of information we needed, we first had to establish a meaningful connection with each person we got on the phone.
Once we had broken the ice, the guide helped us surface three types of insights: jobs to be done (JTBD), user pains, and hoped-for gains, both practical and emotional.
Here are some of the questions we asked after our icebreakers, which typically involved talking about shared network contacts.
- What’s your current job title?
- How long have you worked in this job function? This would help us detect differences among respondents based on their experience level.
- What goals are you trying to achieve?
We would often circle back to this question as the discussion unfolded, because most respondents initially answered it in the context of their work objectives. Over time, we’d occasionally spot more intangible goals that arguably were just as important, such as the desire to get home in time for dinner with their children or the chance to play in their company’s intramural volleyball league.
- How do you measure success?
- Do you influence or make the decision to buy the products in our category?
- Do you use these products?
- What jobs do you complete in pursuit of the goals you’re tasked with achieving?
- Which tools, if any, do you use to complete those jobs?
- What works best about using those tools?
Here, we were looking for things like improved results or efficiency, which would include time or money required to complete the work. We also wanted to know how they felt about a job well done.
- What do you find frustrating about these tools or the ways in which you complete the work that’s assigned to you?
Here, once again, we listened for practical answers – e.g., "I find the interface confusing at times, and it has a long learning curve" – as well as responses that were more emotive – e.g., "I don’t like asking for help."
- How much experience do you have completing these jobs?
This resembles the earlier question about job experience, but we needed to ask it separately to determine how much time respondents had spent on the specific projects that would have required products like the one we were considering.
The results: making the new feel more familiar
The information we gathered and synthesized allowed us to design a better product and execute a more customer-centric go-to-market plan. We determined that we needed to design our product’s UX (user experience) to mirror similar products that our audience already knew well. That represented a departure from how our incumbent product operated, which suggests we might not have made this design decision without the research.
We also learned that users craved ease of use but were wary of products they considered to be too easy. That prompted us to offer more in-app guidance and prompts, which were designed to instruct and offer assurance throughout the customer’s journey in the product.
The results helped us refine our product positioning and marketing strategy as well. Just as we designed the product’s UX to resemble products the audience already knew, we borrowed messaging language that those products used in their marketing. That created a sense of familiarity that helped generate a healthy lead pipeline and accelerate adoption.
If I could run the project over again, I would want enough time to conduct both quantitative and qualitative user research. The two go together like chocolate and peanut butter: two good things that get even better when they’re paired. In this case, though, the qualitative research we conducted enabled us to move faster and build smarter.
I like to think MacGyver would have approved.