The 5 Mistakes that are Ruining Your Survey Data

Written by Allison Dickin | Nov 23, 2020

Surveys have a bad rap. Some people think survey data isn’t useful, others believe surveys take too long to be worthwhile, and still others consider survey responses too biased to use.

The reality is that while surveys have their challenges, they can be extremely valuable, lightning-fast, and reflective of the real experiences of your users… if you do them right. In this article, I’ll explain where people go wrong with surveys and offer fixes that increase response rates and generate faster, more useful results. Follow these tips and surveys will be your new best friend.

Mistake #1: Surveying users at the wrong time

What it is

For survey insights to be most valuable, participants must receive the survey at a time when the questions you’re asking are still relevant. Unfortunately, this doesn’t always happen. For example, maybe you want to survey new users about their onboarding experience with your product, so you send an email survey to everyone who completed onboarding in the past 3 months. The problem? Users who onboarded 2–3 months ago won’t have detailed memories of their experience and will provide less specific or helpful feedback, leaving you with fewer insights to act on.

Why it happens to the best of us

While it’s possible to survey users at the wrong time simply through poor planning, we often do it because we have limited ability to target the right users, or a small base of users meeting the criteria we care about. In the onboarding example, we may not have enough users completing onboarding each week to generate enough survey responses, or we may not have a tool that automatically sends surveys when onboarding is completed, leaving us to manually send batches of survey invites as often as we can manage.

How to do it better

The best way to survey users about a specific experience or about specific features of your product is to ask them questions in-context when it’s most relevant to them, and when the memory of the experience is fresh in their minds. This leads to higher response rates and more specific and actionable feedback.

Surveying users in-context often means surveying them in-product, and you will get the highest response rates this way (as long as you limit your survey to a few quick questions); but in-context surveys can also be administered by email if in-product isn’t an option for you.

In the onboarding example, an in-product survey would be great, but email surveys can work too, as long as the email is sent close in time to the relevant experience. Even sending onboarding surveys monthly is a big improvement over a 3-month cycle; weekly would be even better; and best of all would be triggering an email survey as soon as you log that onboarding has been completed. This can be done with tools like Sprig, which offer automated email surveys in addition to in-product surveys.
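The "trigger on completion" idea above can be sketched in a few lines. This is a minimal, hypothetical example: the event hook, field names, and the 7-day freshness window are all illustrative, not part of any real product's API.

```python
# Hypothetical sketch: fire a survey when onboarding completes, instead of
# emailing users in monthly batches. All names here are illustrative.
from datetime import datetime, timezone

SURVEY_WINDOW_DAYS = 7  # assumed cutoff: only survey users with a fresh memory

def should_send_onboarding_survey(onboarding_completed_at, now=None):
    """Return True if the user finished onboarding recently enough
    for their memory of the experience to still be fresh."""
    now = now or datetime.now(timezone.utc)
    age_days = (now - onboarding_completed_at).days
    return 0 <= age_days <= SURVEY_WINDOW_DAYS

def on_onboarding_completed(user, send_survey):
    # Called by your event pipeline when onboarding finishes (hypothetical hook).
    if should_send_onboarding_survey(user["onboarding_completed_at"]):
        send_survey(user["id"], survey="onboarding_experience")
```

The point of the recency check is the same as the advice above: even if the survey is delivered by email rather than in-product, gate it on how recently the experience happened.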

Mistake #2: Surveying the wrong users

What it is

We often survey the wrong users by asking our questions of a general audience, without targeting those who are most relevant to the questions we want to ask. For example, a colleague of mine recently received a survey asking him to answer questions about various features of a product he had recently signed up for, even though he hadn’t used most of those features yet. He probably didn’t respond, but if he did, his answers would have been pretty meaningless.

As a result of targeting the wrong users, you’ll often see low response rates and therefore slow data collection, since users are less interested in taking surveys that don’t seem relevant to their personal experiences. Your results may also be less actionable if your data is muddled by responses from users who aren’t relevant to your questions.

Why it happens to the best of us

Surveying the wrong users may happen despite our best intentions, when we have a limited user base to work with (leading us to try to get as many responses as possible, even from less relevant users), limited capacity to identify the right users to survey (leading us to send to a wide pool and rely on users to opt in correctly), or limited capacity to survey frequently (leading us to squeeze a few targeted questions into a larger survey covering other topics).

How to do it better

You can minimize the risk of surveying the wrong users by carefully choosing who you send your survey to and only targeting those who are best positioned to answer your questions (e.g., those who have used the features you are asking about). If you are unable to do this and must rely on users to correctly self-report their experiences, take some time to review who actually responded, so you know which users’ perspectives you’re working with when using the results.
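The targeting step described above is ultimately just a filter over your user base. Here is a minimal sketch; the `features_used` field and user records are hypothetical stand-ins for whatever your analytics data actually looks like.

```python
# Hypothetical sketch: survey only users who have actually used the feature
# you're asking about. The "features_used" field is illustrative.
def eligible_recipients(users, feature):
    """Users who have used the feature at least once are best positioned
    to answer questions about it."""
    return [u for u in users if feature in u.get("features_used", [])]

# Example: only user 1 has edited an image, so only user 1 gets the survey.
users = [
    {"id": 1, "features_used": ["image_edit", "export"]},
    {"id": 2, "features_used": ["export"]},
]
recipients = eligible_recipients(users, "image_edit")
```

However you implement it, the principle is the one stated above: filter first, then survey, rather than surveying broadly and hoping respondents self-select correctly.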

If issues like a small user base or limited targeting capacity come into play, then an in-product survey tool can be helpful here as well, enabling you to ask relevant questions based on users’ actual behavior. In-product survey tools can be effective with small samples because they generate much higher response rates than email surveys, giving you quick insights from the right users, even if they are in limited supply. For example, one Sprig customer recently ran a survey focused on image-editing features. They displayed the survey in-product immediately after a user edited an image and identified multiple concrete opportunities to improve the experience in less than a day.

Mistake #3: Asking the wrong questions

What it is

Sometimes we ask complex or confusing questions; other times we ask questions that inadvertently bias our results. But most often, people come to believe surveys aren’t useful after asking questions that couldn’t provide the data they needed to make decisions. This often happens when we ask users what they want instead of asking them what their problems are.

For example, if you ask users whether they’d like you to build a certain feature and 75% of them say yes, that might seem like a clear result, but it doesn’t actually tell you much. It gives you little insight into the value the feature would provide to your users or the ROI you might expect if you were to build it. It would be more effective to ask users about the problem your feature is intended to solve: how big a problem is it? And how valuable would it be to them to have that problem solved?

Why it happens to the best of us

It’s deceptively difficult to write survey questions well. While anyone can write a survey, not everyone is a survey expert; that said, much of survey expertise is an exercise in a particular sort of common sense. Errors in survey design often stem from the unconscious assumption that our users think like us and have the same information we do. They don’t.

How to do it better

To write better survey questions, focus only on topics that users are experts in: their own experiences, pain points, needs, and goals. For more guidance, follow the 3 basic rules for writing survey questions, and review your questions to make sure you’re avoiding bias. Better questions will give you better data and more actionable, easier-to-interpret results.

Mistake #4: Asking too many questions

What it is

There’s so much we want to learn, and it’s hard to know which questions are going to be useful (if we haven’t put in the effort to ask good questions; see mistake #3 above). The problem? Long, repetitive surveys yield much lower response rates, potentially from a biased group of respondents, and data collection takes much longer. Not only that, it’ll take much more time to sift through all the results and pull out useful insights.

Why it happens to the best of us

We often send longer surveys because limited tooling makes surveying users cumbersome, turning it into something we can do only a few times a year.

How to do it better

The answer here should not be a shocker at this point in the article: shorter surveys with better, more targeted questions, sent to users in-context, will yield higher response rates and more actionable results. Another benefit of shorter surveys is the ability to build on results iteratively. Start with a few questions, analyze the results, and dig deeper with a follow-up short survey, all within a matter of days.

Mistake #5: Failing to fully analyze responses

What it is

In our rush to act on survey results, we often don’t take the time to fully analyze the responses. Maybe you asked a few open-ended questions, but since you didn’t have time to manually tag each response, you just eyeballed it and hoped for the best. Or maybe you looked at the overall breakdown of responses but didn’t dig into key sub-segments of users who had unique responses.

Why it happens to the best of us

This can be unavoidable without the right tooling. Time is rarely on our side when it comes to user research, and we’re often forced to sacrifice rigor in order to stay on track with deliverables.

How to do it better

Open-ended analysis in particular can be extremely valuable, opening your eyes to user needs or use cases you had no idea existed and uncovering exciting opportunities. But it’s a fact that manual analysis of open-ends is laborious and time-intensive, unless you have tools like (you guessed it) Sprig to do the analysis for you. However, even without best-in-class tools, it’s worthwhile to spend at least some time reviewing open-ended survey responses. If you commit to spending one hour, or even 30 minutes, reviewing the responses to each survey you run, even if you don’t codify them manually, I promise you’ll find deeper, more valuable insights than you can get from the numbers alone.
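If you do want a step up from eyeballing without a full manual coding pass, a crude keyword-based first pass can surface themes worth reading in depth. This is a toy sketch, not Sprig's method; the themes and keywords are invented for illustration.

```python
# Hypothetical sketch: a rough first-pass coding of open-ended responses
# using keyword-to-theme matching. Themes and keywords are illustrative.
from collections import Counter

THEMES = {
    "pricing": ["price", "expensive", "cost"],
    "usability": ["confusing", "hard to use", "unclear"],
    "performance": ["slow", "lag", "crash"],
}

def tag_response(text):
    """Return the set of themes whose keywords appear in a response."""
    lowered = text.lower()
    return {theme for theme, words in THEMES.items()
            if any(w in lowered for w in words)}

def theme_counts(responses):
    """Count how many responses touch each theme."""
    counts = Counter()
    for r in responses:
        counts.update(tag_response(r))
    return counts
```

A pass like this will miss plenty (synonyms, sarcasm, misspellings), so treat it as a way to prioritize which responses to read, not a replacement for reading them.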

Ready to step up your survey game? Start your free trial today.

Written by

Allison Dickin

Survey fanatic and customer experience advocate. Former Research Director at Yale School of Management and Senior Research Manager at Etsy. Bucknell and University of Chicago alum.
