Learn how to design and launch surveys that ask the right questions at the right time and provide a steady stream of actionable qualitative insights.
We’re living in the age of the customer, where individual software users drive product decisions and market tactics. The trouble is, many software companies don’t know their customers—not really.
A huge reason for this is that companies aren’t talking with their customers. A HubSpot analysis found that 42% of companies don’t survey customers or ask for feedback, and 81% don’t have a formal customer advocacy program. Even among companies that do talk with customers, data from User Interviews suggests over 60% of stakeholders don’t know how to access customer research findings.
With findings like these, it’s no surprise teams struggle to put the customer first.
“Trying to stuff a product down the throat of an unsuspecting bystander is a good way to build the wrong thing.”
— Drew Houston, founder and CEO of Dropbox
Customer-driven is easy to say and hard to do. In truth, most teams run on hunches, directives from leadership, and a good deal of Starbucks—not the customer. These teams reorganize every quarter and are tasked with new features that add theoretical value.
But data indicates these hunch-based features aren’t as valuable as teams think. When ProfitWell asked 2,500 product leaders to assess the last 5,000 features they’d built, leaders put most features in the high-value, high-willingness-to-pay bucket. Yet when ProfitWell asked 1.2 million customers to assess those same features, customers put them in the opposite bucket—low value and low willingness to pay. That’s what ProfitWell considers “trash land.”
“Overall, if you want to deliver an AMAZING customer experience, the SINGLE MOST IMPORTANT thing you can do is LEARN more about your customers so you can custom tailor that experience to them. It's not magic. It's not science. It is simply building a tighter relationship with your customer.”
— Eric Carlson, founder of 10XFactory
To be fair, many teams want to run on customer insights, not hunches. They’ve seen the success of product-led companies like Shopify, Twilio, and Atlassian. (Product-led companies like these have over 2x enterprise value, over 1.5x revenue, and over 9% higher revenue growth than other SaaS players.) SaaS teams know customer understanding is the key to growth. It’s figuring out how to use the key that’s so difficult.
There are two sides to the customer understanding coin
When it comes to customer insights, there are two types of useful data: quantitative and qualitative.
Quantitative data provides important context on what customers are doing and how much. Quantitative data is fixed and measurable. It provides valuable information in contexts like A/B tests and market research.
Quantitative data is incredibly useful, but it has at least one major shortcoming: For all its statistical significance, it cannot help you understand why customers are taking certain actions. For that insight, you need qualitative data. Qualitative data illuminates the why behind the what.
This why information is incredibly important. It allows teams to move from reactive to proactive—from what customers are doing this week to what they’ll adopt in the future. Amy O’Callaghan is a product manager at Snagajob who has practiced customer discovery for over a year. She’s experienced the reactive-to-proactive shift and explains, “I feel confident pushing for things that we’ve prioritized because we know they will bring value—we aren’t taking nearly as many risks with development time as we used to, and the dev team appreciates that.”
To many product managers (not to mention marketers), that level of confidence sounds magical. But obtaining this confidence isn’t magic or even a well-hidden secret. In fact, many teams could start gathering information from their customers today using the simplest of tools: the survey.
Surveys aren’t a novel idea. The earliest ones date back to at least the Roman era, and they’ve been widely used in the US since the 1930s. You probably have one sitting in your inbox right now.
For many teams wanting to collect more qualitative data from existing customers, they’re a great starting point. Chances are, your team already uses a tool (such as HubSpot, Typeform, or Intercom) with survey capabilities built in. Surveys are also lower cost and lower effort than more robust customer research options like interviews and focus groups.
There are many types of surveys, but in this course, we want to walk you through creating one specific type: the qualitative survey. This type of survey targets a specific group of people with specific feedback questions. Oftentimes, those questions are open-ended. The goal of a qualitative survey is to generate insights that help solve business-critical problems.
Other qualitative survey benefits include:
The rest of this course walks you through how to craft a qualitative survey so you can get these benefits. By the end of the following three lessons, you’ll have a firm understanding of who to survey, what questions you should ask them, and which channel you should use.
One enduring myth about surveys is this: they’re a set of semi-random questions sent out to a very random group of people. A good qualitative survey is neither of these things.
“A problem: even the best personas tend to be descriptive, but not predictive.”
The idea behind personas is a good one. They’re supposed to humanize software users and equip teams to make better decisions. The problem with most personas, however, is that they’re largely made up. It’s easy to pull a variety of data from sources like Google Analytics and make a semi-educated guess that most of your customers are “Jill,” a late-20s, iPhone-wielding, trend-setting millennial who lives in the Midwest.
While these visualizations may look great, they bring little predictive power to the table. Will Jill adopt the latest feature? Maybe—she’s trendy, after all—but we can’t really know. Because while personas may tell you what customers look like, they don’t tell you why customers buy or what is going on in their lives when they do. Personas fail to tell you whether Jill even needs that feature to begin with.
To truly understand customers’ behavior, you need to look at them from another angle.
When Clayton Christensen popularized the Jobs to be Done (JTBD) framework, he gave teams a powerful new way to view their customers.
While there’s plenty of nuance to the JTBD framework (we cover that in another resource), the gist of it is this: Customers don’t buy your product. They hire it to do a specific job.
A job, in this context, is a specific type of progress your customer wants to make. To find that job, you have to understand what your customers’ lives look like. You also need to know what pains and frustrations they experience, alternative solutions they considered, and so on. This is the kind of information that helps teams make smart moves around marketing, acquisition, retention, and churn.
One way Typeform leverages customer understanding is through their quarterly “customer voice” report. The customer experience (CX) team generates this report by pulling data from support tickets, churn surveys, sales calls, and other touchpoints. In one instance, the CX team figured out the majority of Typeform’s churn is due to one thing: Customers don’t know what to do after they create their first form. This insight helped the Typeform team combat churn with the “What’s your next Typeform?” campaign. This campaign appears to customers who have completed a form, and it delivers inspirational content around creating more forms.
“Through our churn survey, we found that a lot of our churn isn’t actually due to customers being unhappy, but rather from people successfully completing a project and not knowing what to do next. As a result, beyond the typical ‘feature-based’ content one would expect in a help center, our education team also creates content that is ‘job-to-be-done-based’ in order to inspire customers to do more with Typeform than they had initially intended.”
— David Apple, former VP of Customer Success and Sales at Typeform
Looking at the customer journey as a whole is important. But in this course, we want to home in on what you can learn from the journey stages that intersect your product.
From website visitors to recent cancellations, here are the lifecycle points that are particularly useful for building a JTBD understanding.
This is a customer who’s browsing your website or landing page, but hasn’t committed to your product. These customers can help you understand where your positioning does (and doesn’t) resonate and what information about your product isn’t clear.
This lifecycle point is especially useful for marketers. Insights from website visitors can help marketers:
This is a customer who recently signed up for your product or service. They can help you understand what the customer is:
Maggie Crowley, Director of Product Management at Drift, asks new customers questions like, “What are your big goals for this year?” and “What kind of conversion rates are you trying to hit?” Because if you know what your customers are trying to achieve, you’re one step closer to making them superheroes with your product.
This is a customer who recently upgraded their relationship with you. These customers can help you answer questions like: Why did they upgrade now? What changed that pushed them to upgrade? What you want to identify here is a buying trigger. Triggers tell you how to move customers through the upgrade process more efficiently.
New Feature Activation
This is a customer who recently adopted a new feature. These customers can help you answer feature-specific questions such as:
Failed Feature Adoption
This is a customer who started using a newly released feature but then abandoned it. These customers can help you identify what is difficult about a feature, where it fails to meet expectations, and what pieces of the puzzle you may still be missing.
Feature Milestone Usage
This is a customer who has reached a specific “power user” level. They’ve hit some metric that marks them exceptionally successful with your product. This metric is usually quantitative, such as active days per month, number of actions completed, shares on social media, or total hours in the software.
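If you want to compute this kind of power-user metric yourself, the logic is simple enough to sketch. The snippet below is a hypothetical Python example: the event data, field names, and active-days threshold are illustrative stand-ins for whatever your own analytics export contains.

```python
# Hypothetical sketch: flag "power users" by distinct active days.
# The events, field names, and threshold are illustrative only.
from collections import defaultdict
from datetime import date

events = [
    {"user": "ana", "day": date(2024, 5, 1)},
    {"user": "ana", "day": date(2024, 5, 2)},
    {"user": "ben", "day": date(2024, 5, 1)},
    # ...one row per user action
]

POWER_USER_THRESHOLD = 2  # active days per month; tune to your product

# Count each user's distinct active days
active_days = defaultdict(set)
for event in events:
    active_days[event["user"]].add(event["day"])

power_users = [user for user, days in active_days.items()
               if len(days) >= POWER_USER_THRESHOLD]
```

Swap in actions completed, shares, or hours in-app as the metric; the shape of the calculation stays the same.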
Because power users often have very different usage from normal users, they can help you:
This is a customer who has recently left your product. They can help you understand:
Over time, surveying new cancellations will help you identify common churn patterns and prevent them, just as Typeform did in the case study earlier in this lesson.
Some segments are very easy to identify. New signups, recent upgrades, and new cancellations should be apparent in every product or payment dashboard.
Other segments, such as new feature activation, failed feature adoption, or feature milestone usage, will be easiest to see in third-party software. Tools like Mixpanel, Heap, and Amplitude all surface this information. Check your tool’s documentation for additional guidance.
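If your analytics tool exposes raw event exports, you can also compute a segment like failed feature adoption directly. Here’s a minimal sketch, assuming a hypothetical event name and a 14-day inactivity cutoff; adapt both to how your team defines “abandoned.”

```python
# Minimal sketch: find users who tried a new feature but haven't
# touched it recently. Event names, fields, and cutoff are assumptions.
from datetime import date, timedelta

feature_events = [
    {"user": "ana", "event": "used_new_editor", "day": date(2024, 5, 1)},
    {"user": "ben", "event": "used_new_editor", "day": date(2024, 5, 20)},
]

today = date(2024, 5, 25)
cutoff = today - timedelta(days=14)

# Latest use of the feature per user
last_use = {}
for e in feature_events:
    if e["event"] == "used_new_editor":
        if e["user"] not in last_use or e["day"] > last_use[e["user"]]:
            last_use[e["user"]] = e["day"]

# Tried the feature, but not recently: candidates for a follow-up survey
abandoned = [user for user, day in last_use.items() if day < cutoff]
```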
Once you know which segments you can survey, your next step is deciding which segment and what to ask.
In this lesson, we’ll cover everything you need to know to create surveys you can use to guide your strategy.
You have to know how to ask the right questions
Surveys are probably the easiest and most flexible form of feedback to implement, which means a lot of people simply open up Google Forms or Typeform, plug in some questions, and ship the survey out to their audience.
They get tons of responses in a matter of minutes, hours, or days. Then, boom: every decision they make from there is data-driven and accurate.
Well… not exactly.
Let’s make sure you’re asking the right questions in the right way.
Write out all the questions you think you need. This isn’t the time to consider bias or format, just put the questions on paper.
This is when you want to get clear about what you need to learn from your survey.
Grab a whiteboard, a scratch pad, or a blank Google Doc (or use this HBR brainstorming tactic) and brainstorm the lessons you need to take away from this exercise.
Allow yourself to brainstorm freely, but try to be as specific as possible with the questions you need answered. The more specific the better.
For example, you might want to know, “Why is our churn so high?” But you’ll have difficulty getting a clear answer to such a broad and heavily nuanced question.
If, instead, you ask, “Why do so many customers churn after signing up but before uploading their first photo?” you’ll be able to create specific questions with specific answers.
This is the part where you cut out any unnecessary questions. Don’t waste your time or your respondents’ time asking non-vital questions.
Your survey has to pack a punch. It needs to gather as much accurate data as possible with as few questions as possible, because people are less likely to complete a long survey, no matter how much they love your product or service!
Only ask the questions that you really need the answers to. If it doesn’t matter how old the respondent is, then don’t waste the question space asking for their age.
Every respondent is doing you a favor when they take your survey, and they have just as many daily tasks and distractions as you do. Keeping it short means you only ask the really important questions.
SurveyGizmo says, “you should aim for 10 survey questions (or fewer, if you are using multiple text and essay box question types).”
Now, I don’t agree that you have to ask 10 or fewer questions, but there is such a thing as response burden: asking a lot of questions of your respondents can be really annoying, and they’ll just stop answering.
The response burden study was about people’s willingness to answer survey questions about their health. So it’s a fair assumption that if someone isn’t willing to answer 75 questions about their current health status for their doctor, you have very little chance of getting them to answer that many about your company.
The best thing you can do is to tell your respondents how many questions they can expect and how much time the survey will take them to complete before they open the survey.
If you really need to ask 15 questions, then you should ask them.
If it’s a long survey, it’s a good idea to provide an incentive, such as a drawing for a gift card or a coupon code on completion. An incentive lets your customers know you value their time.
Bias creeps in when you ask a question in a way that leads the respondent to a particular thought or assumption simply through how the question is phrased. There are a few ways to introduce bias into your questions. Here are some examples I’ve come across.
A loaded question makes an assumption before gathering the right information. Here’s an example of a loaded question sent to me from Airtable.
You have endless opportunities to get your survey in front of your audience. So, it’s important to choose the one that will result in quality responses. This question assumes that Airtable provides me value. Airtable would have gotten better, more accurate data from me had they asked:
“Do you find Airtable valuable?” and left it open-ended or “On a scale of 1-10 how valuable do you find Airtable?” then asked me what I find valuable about Airtable.
Omit Double-Decker Questions
Because you want your survey to be as clear as possible, make sure you only ask one question at a time. This may seem obvious, but double-decker questions can easily sneak in.
A double-decker question is where one question has two or more components, such as...
Datacamp wants to know how much I know about their competitors, but the question itself asks me about subscription *and* product features for teams *and* businesses. Each *and* is an opportunity for confusion.
A better way to ask this question:
“Which of the following options do you recognize?”
“Of those brands, which ones offer a subscription for businesses?”
“Which ones offer a subscription for teams?”
“Which ones offer product features for businesses?”
“Which ones offer product features for teams?”
When we break down their one question, it’s actually asking five different questions. No wonder I was confused!
A leading question prompts a specific answer. It creates inaccurate information because all it does is support the assumptions or feelings of the survey creators. Political surveys provide a great example of leading questions.
Here’s one from the Trump administration that was trying to gather feedback on Americans’ ideas of “mainstream media”.
This question is leading in a couple of different ways.
First, it assumes we all agree on the definition of mainstream media. It doesn’t provide an example of how the term is being used; it leaves it up to the respondent to determine what mainstream media means.
Second, it says the media “reported unfairly on our movement.” The word “our” automatically introduces bias: if this is “our movement,” the respondent is assumed to play a part in it. This small word of ownership skews all the responses.
A better way to ask this question would have been: “When you watched (name the news channels), did the reporting appear fair?”
Motivated forgetting comes into play when you ask a respondent to remember a recent purchase or action. If you’re not able to ask the survey question as the purchase is happening (using a website pop-up survey at checkout, for example), then you’re likely to get inaccurate recollections.
Now that you know how to ask the questions, it’s time to consider how you want to receive the answers. Here are the most popular ways to format your questions.
Open-ended questions allow the respondent to answer in their own words inside a text box. Most survey templates allow long-form answers (like a paragraph) and short text (like a sentence).
Open-ended questions are great when you need to hear the customer’s voice in the answer and learn the why behind their answer.
Reasons you may want to hear the customer’s voice:
Airtable asked two open-ended questions in their customer feedback survey.
These are great examples of when to use an open-ended survey question. They’re looking to hear the words their customers are using to explain their product to their friends and colleagues.
I love how they divided the question by industry knowledge. The insights discovered here will be helpful in all kinds of marketing and advertising copy.
But most of your questions shouldn’t be open-ended. If you’re asking mostly open-ended questions in a survey, you’d be better off asking a few customers to schedule a phone call, rather than releasing a survey.
Knowing the difference between open-ended and closed-ended questions (and the best time to use them) is an important part of a successful survey.
According to Hotjar, “Open-ended questions are broad and can be answered with detail, while closed-ended questions are narrow, multiple-choice questions that are usually answered with a single word or selection.”
Airtable has another good example of mixing open-ended and closed-ended questions in its survey.
Question seven asks about favorite radio programs and offers a short text answer box, while number eight gives the option to check multiple boxes in order to answer the question.
Closed-ended questions allow only certain kinds of answers, such as multiple choice, checkboxes, Likert scales, and nominal questions.
Most survey questions are closed-ended. In fact, SurveyMonkey recommends asking mostly closed-ended questions.
Ask closed-ended questions when:
Here’s an example of a closed-ended question from Adobe.
Had they left this question open-ended, their responses wouldn’t have been clear or easily organized.
When asking closed-ended or multiple-choice questions, give the respondent an opportunity to choose “other” or “does not apply.” If you pigeonhole someone into answering, you won’t get accurate information, and you’re likely to frustrate the respondent.
Chances are, if you’ve taken a survey, you’ve answered a question using a Likert scale. A Likert scale asks you to rate your satisfaction or agreement on a numbered scale, such as from “not at all likely” to “extremely likely.”
I love to buy my groceries online, and my HEB sends me a customer satisfaction survey after each purchase. (Which means I’m getting surveys once a week in my inbox, but that’s a whole other post about survey frequency and survey fatigue!)
Here’s a great example of a Likert scale question:
Likert scale questions are best kept specific to one topic and can provide insight into the overall attitude your customers have about that topic.
HEB asked me about my overall satisfaction with their substitution experience. They didn’t ask about my overall online shopping experience, or my overall satisfaction with my online order. They kept the question very specific.
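Once Likert responses come in, two simple summaries cover most of what you need: the distribution of answers and a “top-two-box” score (the share of respondents who chose 4 or 5 on a 1-5 scale). The sketch below uses made-up response data.

```python
# Illustrative sketch: summarize 1-5 Likert responses.
from collections import Counter

responses = [5, 4, 4, 3, 5, 2, 4, 5, 1, 4]  # made-up data

# How many respondents chose each point on the scale
distribution = Counter(responses)

# Share of respondents who answered 4 or 5
top_two_box = sum(1 for r in responses if r >= 4) / len(responses)

print(dict(sorted(distribution.items())))  # {1: 1, 2: 1, 3: 1, 4: 4, 5: 3}
print(f"Top-two-box: {top_two_box:.0%}")   # Top-two-box: 70%
```

Tracking the top-two-box score over time, per question, is a quick way to see whether attitudes toward a specific topic are improving.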
Choose the best format for each question and give your respondents variety!
If you’ve spent hours crafting the perfect questions, you don’t want to throw them onto a plain template and call it a day. Use great design to help guide respondents through the survey.
When designing, keep in mind who is taking the survey. Are you sending this to high school seniors? Mid-level managers at work? Stay-at-home moms?
There’s a huge difference in the two survey examples below. Dedoose is a scientific research organization, so many of their respondents are scientists or in academia. Their design is very basic, maybe even a little bland.
Contrast it with the Fort Worth Museum of Science and History, whose survey went out to members of the children’s museum.
The gist of the two surveys was the same: both polled their audience to learn how customers use their product. But their design and tone were completely different!
Before you send your survey out into the world, you’ll want to give it a test run. Choose a small group of people to take the survey: internal employees, friends, or a select group of customers.
Let the testers know this is a test run, and that you’d appreciate their feedback on any confusing questions, wording or format.
Then, when you receive their feedback, make the appropriate changes.
So you’re just about ready to send the survey out into the wild. But, there’s still one more detail… How are you going to get the survey in front of your audience?
When we shared our first survey, we simply emailed a link to our subscribers, but there were so many other ways we could have reached our audience that we didn’t even consider!
An exit survey pop-up is timed to appear when your cursor moves quickly toward the close button, or at the average time most people bounce from your website.
According to Survicate, exit survey response rates can vary from 5% to almost 60%.
That’s a huge variance, but it shows there’s an opportunity for quality responses if you ask the right questions, to the right customer, at the right time.
Here’s an exit-response survey pop-up I received while buying business cards from Vistaprint.
I’d love to see the response rate of these surveys. Vistaprint asked this before I had purchased the business cards, and it felt premature to me. I screenshotted the pop-up and then quickly X’d out of it to keep shopping.
SurveyGizmo found that email response rates can “soar past 85% (about 43 responses for every 50 invitations sent) when the respondent population is motivated and the survey is well-executed.”
Response rates can also fall below “2% (about 1 response for every 50 invitations sent) when the respondent population is less-targeted, when contact information is unreliable, or where there is less incentive or little motivation to respond.”
Moral of the story: write great email copy and send it to an engaged audience.
The Fort Worth Museum of Science and History sent me their survey via email.
I’ve been a loyal member for eight years; I take my kids there frequently, so when they asked what matters to me about my membership, I was happy to take the survey.
Onboarding is a great time to ask your customers a couple of quick questions.
They’ve just agreed to start a trial or have signed up for your product. They know exactly why they need you and what convinced them to purchase at this moment. Since it’s fresh in their mind, why not ask, so you can have all that precious information too?
Here’s an onboarding survey question from Demio. This pop-up happened right after signing up for an account.
The question is simple, “Is your company currently running webinars?”
The answers they receive will help them segment their audience with greater accuracy and ultimately give each customer a better experience.
The thank-you page survey is crucial for ecommerce businesses for the same reason an onboarding survey is critical to subscription businesses: learning more about your customers at that moment in their buyer journey helps you create a better experience for that particular customer and future customers.
Asking customers to complete a survey on your social channels should be a given, because those are highly engaged customers who enjoy your product enough to follow you on social. And those are the kinds of customers you really want to hear from.
Here’s a great example from Candid Athletic Training asking people on Twitter to complete a survey. What I love about this ask is that it specifies the number of responses they need.
Letting people know they need 22 more athletes helps to prevent social loafing: the idea that in a large social setting (like Twitter), someone else will do the work.
Pointing out that they’re close to their goal will help motivate athletes who haven’t yet taken the survey.
Here’s an example of a motivated forgetting question. Walmart wanted to know if I would recommend them to a friend, but only considering the purchase experience for one item.
Now, I’ve been to Walmart thousands of times. Do you think I was able to recall one specific instance and base all my answers on it, or was I using a lifetime of data to skew my response? Most definitely skewed.
Asking About the Future
“Asking your survey respondents questions about the future is basically asking to be lied to.”
This is so true.
Here’s an example of a company asking me about what I *might* be willing to pay in the future.
Currently, this museum offers free parking as a part of its membership fees. While I don’t know their results, I can only imagine that very few, if any, agreed to be charged more money in the future.
Our brains are constantly trying to make patterns and understand the relationship between two things. This is why question order has to be considered as a type of bias in survey design.
If you mention a product, experience or person in an earlier question, the respondent will likely apply that to all of the following questions.
Pew Research found this to be true in a 2008 political poll. Respondents were asked, “All in all, are you satisfied or dissatisfied with the way things are going in this country today?” immediately after being asked, “Do you approve or disapprove of the way George W. Bush is handling his job as president?”
88% said they were dissatisfied, compared with only 78% when the satisfaction question was asked without that context.
Mentioning George W. Bush in a previous question shifted later answers by 10 percentage points! That’s quite a huge statistical impact.
Bias is like a quiet toddler playing in the other room. You don’t realize it’s bad until it’s way too late. Prevent it at all costs, and if you notice it after the fact, you have to be willing to either dismiss the answers or at the very least not make huge business decisions based on them.
“Ask them what they care about. Ask them how they are measured in their role...you should know how your customers are evaluated in their performance reviews. You should know what they care about, what they wake up thinking about. Start there.”
— Maggie Crowley, Director of Product Management at Drift
If you put garbage into your survey, you’re going to get garbage out of your survey. To gather insights instead of trash, be strategic. Start with a specific problem and choose the right people and questions to answer that problem.
“Your objective directly impacts every aspect of the research, from the scope of the study down to the questions you ask. Spend the time upfront to define what you want out of it.”
— Jesse Caesar
You can’t afford to spend time and resources gathering insights for the sake of insights. You need insights to solve specific problems.
To find those insights, start with a painful problem you want to solve and a clear sense of why solving it matters. For example:
Stating your problem and why solving it matters does three important things:
If you manage people or read much psychology, you know humans aren’t totally honest. They also don’t know what they want and are terrible at predicting their future behavior. So, how can you get reliable qualitative data from customers?
You ask smart questions.
“Rather than asking: ‘Why did you buy our product?’ ask ‘What was happening in your life that led you to search for this solution?’ Instead of asking: ‘What's the one feature you love about [product],’ I ask: ‘If our company were to close tomorrow, what would be the one thing you’d miss the most?’ These types of surveys have helped me double and triple my clients.”
— Talia Wolf, founder and chief optimizer at GetUplift
This, of course, is easier said than done, especially when you consider the many kinds of questions you can ask in surveys. Question types include open-ended, closed-ended, nominal, and rating scales. Each of these is useful in its own way.
To save you reading time and make survey creation easier, we’ve put together LearnWhy Playbooks. A playbook starts with a type of problem you’d like to solve (e.g., churn) and ends with the exact questions you can add to your survey.
Check out your options in the Playbooks gallery below. You can filter by the problem you’re solving, the customer’s lifecycle stage, or the area of the company. We’ve covered who to survey and what questions to ask them. Now, one of your last decisions is where to launch your survey.
Choosing where you will survey customers is one of your last major hurdles. Common distribution options for SaaS companies include:
There are a few things you need to consider when you weigh these channels:
What do you have access to?
If the customer hasn’t provided their phone number, for example, that avenue isn’t an option. Or if you can’t easily segment your email list, a targeted email survey may be more headache than it’s worth.
How contextual does the survey need to be?
If you’re hoping to understand why a customer is adopting a particular feature, surveying them while they’re interacting with the feature is ideal. Likewise, if you want to understand why a customer upgraded, asking them on a thank you page right after they upgrade, or in-app after they adjust their subscription plan, is ideal.
Which customer segment are you targeting?
If you’re surveying churned customers, they’re no longer in the app. This rules out in-app surveys. If you’re surveying website visitors, your only option is on the website.
What do your customers prefer?
Some customer types may have particularly full inboxes, prefer text messages, or have an aversion to text messaging. As always, meet your customers on their terms, and keep their entire context in mind.
How much does response rate matter?
In-app surveys are easier for customers to complete and have higher response rates than email surveys. However, email surveys usually gather richer qualitative data because the respondent intentionally sets aside time to respond.
One other thing to note: traditional in-app pop-ups and email surveys aren’t your only options. Particularly with the in-app approach, there are many creative ways to prompt customers.
Hiten Shah, for example, came up with a particularly clever way to answer new feature questions for his most recent business, FYI. “If we’re going to add a new feature,” Shah says, “we test it by adding a button for it first before we’ve built out any of the functionality. When people click the button, we ask them questions about their motivations.” This not only tells Shah how many people are interested in a feature (a quantitative measure), it tells him why people want it (a qualitative measure).
In short, don’t be afraid to break the mold when you engage with your customers!
“You need to come into qualitative market research with an absolutely open mind. If you’re fixed on a certain outcome, you’ll selectively read the output in your favor. The point of research is to be humbled by it — and inspired to do better.”
– Jesse Caesar
After you get your survey up and running, you’ll need a way to gather responses, organize them, and look for patterns. A few tips here:
LearnWhy analysis and tagging make it easy to organize feedback
Visuals are a fast and easy way to track patterns in your data