Tell me if this sounds familiar...
You just spent hours writing (and rewriting) questions and planning for your new user survey. You’re confident that the responses to these questions are going to give you and the product team the intel you need to make impactful product improvements.
You put the finishing touches on the survey, launch it, and get a response rate of about 1%.
Sure, you can use the data, but think of all the biases and data quality issues that creep in when 99% of the people you sent the survey to didn't respond. Is that giving you and the product team the confidence you need to move forward?
So why does this happen? Better yet, how do we prevent this from happening in the first place? One way is to use contextual in-product surveys, triggered based on user actions and attributes within the product experience. Below, we’ll discuss five tactical tips for how you can get higher response rates from your users with these contextual user surveys. These tips will help you ensure that the resources put into your user surveys aren’t in vain (and help you save time while doing it).
Let’s get to it.
5 Tips for Boosting Your Research Survey Response Rate (using In-Product Surveys)
1. Set clear expectations up front
Regardless of the number of questions you want to ask, set clear expectations. Setting expectations is key to reducing survey dropout among your respondents. For instance, how long will the survey take? If users know up front whether it will take two minutes or five, they're more likely to participate. Here are more ways to set expectations:
- Write a short headline about the survey.
- Advise the user on the number of questions they will answer.
- Avoid surprises.
- Inform the user of any special requirements in your survey (like a request to respond using video).
- Explain why the survey is worth their time (i.e., how it will benefit them to take it).
- Disclose whether or not the information the user supplies stays confidential.
2. Lessen the cognitive load
The moment users have to think hard to answer your questions, you’ve lost them. Before launching your survey, double-check four things:
- Am I using “big” words or jargon?
Strive to eliminate “big” words from your surveys. They may make your questions difficult for users to understand. Instead, use simple words — words that an 11-year-old can comprehend. Similarly, avoid using industry language. It’s easy for us to understand what we mean by “onboarding,” “users,” and other technical terms. However, our users may not understand our jargon. So keep it simple.
- Am I asking brief questions?
Keeping questions brief also lessens the cognitive load on your users. Most people skim or scan text, which makes paragraph-length questions a no-no. Where possible, keep each survey question under 20 words. As sentence length increases, so does reading difficulty, meaning your users may have to reread long sentences just to understand what you're asking.
- Am I asking questions users are capable of answering?
Don't ask users to remember something they did six months ago. That info may no longer be relevant to them, and even if they do decide to answer, they'll have to strain to recall it.
- Did I start my survey with a close-ended question?
After ticking off the three points above, check whether you started your survey with a close-ended or an open-ended question. It's best practice to start user surveys with close-ended questions, and then follow up with open-ended questions.
Here’s why:
Open-ended questions require more thought to answer, just by their nature. When users struggle to answer your first open-ended question, they could lose interest and skip the survey altogether. But when the first question is close-ended, they don’t need to think so much right off the bat. This allows them to get comfortable, making them more willing to answer any open-ended questions that follow.
The bottom line is that you can get high survey response rates when users can answer your questions easily, without much thought.
3. Don’t make users self-report what you already know
Asking your users to report what they expect you to know is a huge turnoff. Don't ask a free user what plan they are on. Likewise, try not to ask users for info on the last time they made a purchase.
Questions like this can frustrate survey takers and negatively impact response rates because users know you already have the data. Refraining from asking these questions means you ask fewer questions. And fewer questions often result in higher survey response rates.
If you must ask questions related to the data you have, let your users know that you have done the groundwork. For instance, you could ask, “How was your experience with the {product feature} in your Pro plan?” The key term is “Pro plan.” This way, the user feels the question is more relevant and is more likely to respond.
Making sure you send surveys only to the specific users you need to hear from can be challenging when you don't have easy access to that information. That's where contextual targeting shines. Contextual targeting, surveying users as they're actively using your product, allows you to choose your audience based on specific user characteristics or actions. It also lets you ask your questions in the moment, resulting in more specific, actionable responses.
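To make that concrete, here's a minimal sketch of what contextual targeting logic can look like under the hood. The event name, plan values, and `showSurvey` callback are hypothetical placeholders for illustration, not any specific survey vendor's API; most in-product survey tools expose an equivalent audience filter.

```typescript
// Illustrative user profile shape; in practice this comes from your
// product analytics or customer data platform.
interface UserContext {
  userId: string;
  plan: "free" | "pro" | "enterprise";
  recentEvents: string[]; // events the user completed in the current session
}

function maybeShowFeatureSurvey(
  user: UserContext,
  showSurvey: (question: string) => void
): void {
  const justUsedFeature = user.recentEvents.includes("exported_report");

  // Only ask Pro users who just used the feature we care about, and
  // reference what we already know ("Pro plan") instead of making them
  // self-report it.
  if (user.plan === "pro" && justUsedFeature) {
    showSurvey("How was your experience exporting a report on your Pro plan?");
  }
  // Everyone else simply never sees the survey, which keeps it relevant
  // and protects the response rate.
}
```

In practice you would wire `showSurvey` to whatever in-product survey tool you use; the point is simply that the audience check happens before the question is ever shown.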
4. Limit multiple-choice options
When creating a contextual or in-product survey, we recommend limiting your multiple-choice answer options to five. Too many options make it difficult for the human brain to process and weigh the choices.
A study by Elena Reutskaja, Associate Professor of Marketing at IESE Business School, found that an excess of alternatives causes people to delay decision-making, a phenomenon known as “choice overload.” For your users, choice overload may cause them to opt out of your survey altogether. You don’t want that, so stick with five multiple-choice options or fewer.
In addition, always include “other” as an option so that users can enter text if their answer isn’t covered by your multiple-choice options. If the same “other” response comes up frequently, do two things. First, use open text analysis to automatically analyze the “other” responses instead of spending hours reading them one by one. Next, consider updating the multiple-choice options to include that recurring “other” response.
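If you just want a rough sense of how often the same “other” answer recurs before investing in full open text analysis, a simple frequency count goes a long way. The sketch below is purely illustrative; the normalization and threshold are assumptions, not a specific tool's behavior.

```typescript
// Count how often each free-text "other" response appears, after light
// normalization, so recurring answers surface as candidates for new
// multiple-choice options.
function frequentOtherResponses(
  responses: string[],
  minCount = 5
): Map<string, number> {
  const counts = new Map<string, number>();

  for (const raw of responses) {
    const normalized = raw.trim().toLowerCase();
    if (normalized.length === 0) continue;
    counts.set(normalized, (counts.get(normalized) ?? 0) + 1);
  }

  // Keep only answers that show up often enough to justify promotion
  // to a first-class option.
  return new Map([...counts].filter(([, count]) => count >= minCount));
}
```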
5. Use rich media for survey questions and responses
Every day, millions of people take Zoom calls. Millions more watch YouTube, TikTok, and Instagram videos. In other words, many people are choosing video over other content formats. So why do so many companies ignore video as an opportunity for research?
The rise of video as a preferred format opens the door to video questions.
Because video is a much more conversational way to leave feedback (not to mention kind of fun), some users may be more inclined to record a quick clip than to write out full-text responses.
And lastly, video gives researchers deeper context on a user’s issues through facial expressions and body language. We don’t have to tell you how invaluable that kind of information is in helping you further understand your users’ pains and challenges.
Why Contextual, In-Product Surveys Can Be Your Research Differentiator
Researchers have many ways of figuring out their users’ challenges. Ethnography, personas, customer satisfaction score (CSAT), NPS surveys, and user interviews are a few of the methods they use. All of these have their place. However, when you need to collect insights quickly, right inside the product your customers are using, in-product surveys are your competitive advantage.
How? They elevate the quality of responses you get and allow you to conduct research any time you want. Specifically:
- In-product surveys allow you to capture accurate insights by targeting specific customer segments. You could target customers with criteria like event completion, time in-app, app version, and other custom attributes (see the sketch after this list).
- In-product surveys allow you to gather insights while users are actively engaging with your product, which means the feedback they give is fresher and more accurate.
- Targeting users accurately with in-product surveys catches them in the moment. They can see that you understand their challenges, which sparks an “aha!” The result? Users get more interested in your surveys.
- You can also research more often with in-product surveys. All you need to do is put together your survey questions and hit launch.
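For illustration, targeting criteria like the ones above often boil down to a small declarative definition. The shape below is a sketch, not any particular platform's schema; field names such as `eventCompleted` and `minTimeInAppMinutes` are assumptions.

```typescript
// Hypothetical audience definition for an in-product survey. Each field
// mirrors one of the targeting criteria mentioned above.
interface SurveyAudience {
  eventCompleted?: string;        // e.g. only users who finished a key action
  minTimeInAppMinutes?: number;   // e.g. only users with real usage
  appVersion?: string;            // e.g. only users on a specific release
  customAttributes?: Record<string, string | number | boolean>;
}

const betaFeatureSurveyAudience: SurveyAudience = {
  eventCompleted: "enabled_beta_feature",
  minTimeInAppMinutes: 10,
  appVersion: "2.4.0",
  customAttributes: { plan: "pro", region: "EMEA" },
};
```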
Wrapping up
So there you have it — by doing in-product surveys and following the tips above, you’ll be well on your way to boosting response rates in your next round of research.
Here’s a quick recap of the five tips:
- Set clear expectations up front
- Lessen the cognitive load of users
- Don’t make users self-report what you know
- Limit multiple-choice options
- Use rich media for survey questions and responses
Interested in learning how Sprig can support you in achieving greater user response rates faster? Get in touch with a member of the Sprig sales team today.
Or, while you’re here, check out how some companies have been able to achieve up to a 90% response rate using in-product surveys.