ONLINE GUIDE

Get Support for User Research: Your Guide to Demonstrating Impact

Contents

Introduction
The state of user research
What is research impact and why does it matter?
Why is it hard to effectively measure research impact?
A Framework for Demonstrating Impact
Putting the Framework to Work
Summary


Introduction

Demonstrating the impact of user research is critical, particularly in an economic downturn when business leaders are considering cutting certain departments. So how do user researchers define and then communicate the impact of their research?

Using the work of leading UX researchers, we’ve put together this guide to help researchers demonstrate the value of their work.


The state of user research

This year, the practice of user research was defined by an increased use of tools, including AI, and a drastic increase in layoffs, according to a report by User Interviews.1

TL;DR
  • User research programs were heavily affected by tech layoffs
  • The average researcher has a robust tech stack with more than 10 tools
  • Despite fear of AI, researchers are starting to use the technology

The average user researcher relies on 13 different tools. At the core of that stack are general-purpose tools, the report said.

“General-purpose tools play a crucial role in all stages of user research, forming the backbone of most tool stacks. Figma tops the list—a whopping 81% of researchers use the tool for prototyping/design, while 31% use FigJam, the company’s flexible whiteboarding tool.”

While there is general trepidation around AI, some researchers are already embracing it: roughly a fifth are using the technology or planning to use it in the future.

The year was also defined by layoffs. User research programs were among the hardest hit, with companies like Amazon letting go of entire research teams.2

According to the report from User Interviews, half of researchers were directly or indirectly affected by layoffs in the last 12 months.3

While these layoffs are alarming, they don’t necessarily signal a long-term deprioritization of user research. User insights remain the most direct way to understand what users think and want.

As companies assess the need for user research programs, a key way to show the importance of research is to develop a methodology for showing research impact.

What is research impact and why does it matter?

Simply put, research impact is the tangible, measured results of research. As a user researcher, tracking and measuring the impact of your work will help you:

  • Make sure stakeholders appreciate the value of research
  • Make a case for additional research resources, whether that's additional tooling or more researchers
  • Understand whether to invest in more (or different) research methods
  • Know which research projects have the most impact on the product roadmap4

Overall, demonstrating impact increases the credibility of your research team. In turn, you can secure additional investment, and you may be invited to help make strategic decisions about the future of your product or service.5

Why is it hard to effectively measure research impact?

Research is one of the hardest disciplines to measure. For a discipline such as engineering, the measurement can be as direct as churn rate or feature adoption, but measuring research is more complicated. Let’s cover some of the top reasons why:

Intangible Results: The ultimate goal of research is to help product teams and stakeholders better understand users, but that goal is hard to measure. It’s easy to tell if a feature launches on time, but it is more difficult to tell if stakeholders better understand users and are applying that understanding to build better products.

Time Between Research and Application: It can be easier to document the impact of evaluative and tactical research (like concept or usability testing), particularly when it informs a change in the product or experience before it’s shipped. But for foundational research, it might take months or years to see an impact on the product. Given the wider time gap between the initial research and the analysis of the results, it’s much easier to let documentation slip through the cracks.

Lack of a Framework or Process: According to Allison Dickin, lead researcher at Relief App, the key reason that researchers don’t evaluate impact is the lack of a framework. A framework is a systemic approach to identifying, documenting, and tracking the impact research has on an organization. “A well-grounded framework and process can help teams consistently document their work and effectively demonstrate impact,” said Dickin in an interview with the Sprig editorial team.

Time Limits: Limited time can be one of the biggest barriers. Research teams have busy schedules and often need to wrap up a current project in order to focus on the next priority, which is usually another big question to answer. Time limits make it difficult to document and properly contextualize research results.

A Framework for Demonstrating Impact

There are dozens - even hundreds - of frameworks for user research, and the sheer number can make it difficult to discern which one makes the most sense for your product team.

Framework: A systematic approach to identifying, documenting, and tracking the impact research has on an organization.

We’ve selected a framework designed to demonstrate impact, specifically during an economic downturn when user research may be deprioritized. This framework is inspired by a talk by Victoria Sosik, Director of UX Research at Verizon. In the talk, Sosik outlines three key elements of a framework:

  1. Research Activity to Drive Impact: Define the specific research study or cross-functional research workshop. This includes evaluative, generative, and iterative research.
  2. Impact or Recordable Instance of Influence: Sosik tracks eight different types of impact, including influenced product change, influenced product strategy, and stakeholder participation. 
  3. The Scale of the Impact: How you define and categorize the scale of impact depends on the size of your company. The impact scale runs the spectrum from individual stakeholders to the organization and beyond.6

Once you’ve determined the individual components, log these into a tracking document using a tool like Google Sheets — here’s a template for you to use.
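If you prefer to script the setup rather than start from a blank sheet, a minimal sketch of such a tracking document is a CSV with one column per element of the framework. The column names below are our own illustration, not necessarily those used in the linked template:

```python
import csv

# Hypothetical columns based on Sosik's three framework elements,
# plus a date. The actual Google Sheets template may differ.
COLUMNS = [
    "Date",
    "Research Activity",  # study or workshop that drove the impact
    "Impact",             # the recordable instance of influence
    "Scale of Impact",    # individual stakeholder .. beyond the company
]

with open("impact_tracker.csv", "w", newline="") as f:
    csv.writer(f).writerow(COLUMNS)
```

Each observed impact then becomes one row appended to this file, which keeps the log easy to import back into a spreadsheet.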


Putting the Framework to Work

To demonstrate how to use this framework, let’s look at how Brandie Smith, former lead researcher at WayBetter and Metromile, took Sosik’s model and put it into practice.7

Smith recently sat down with us for an episode of our People Driven Products podcast, where she walked through exactly how she uses Sosik’s model day to day.

1. Note the Impact

Start with a descriptive note of the observed impact, along with the date. For example, an entry by Smith might read: “I led leadership through a 10-minute tagging exercise of NPS comments. CEO slacked me, ‘this is such a smart way to engage leadership.’”

It’s important to note that the impact might be qualitative feedback, such as a note from leadership or an email from stakeholders. Not all impact has to be a hard metric such as an improved churn rate.

2. Document the Research Activity

Record the name and date of the research activity—such as a study, a workshop, or a document—that this impact came from. The key here is that it’s not just a deliverable but an activity that encompasses all stages of research as a whole.

In Smith’s case, she leads a monthly meeting where she and a cross-functional team compile a holistic view of the customer experience, which is then presented to leadership. For this, Smith might write down that the research activity was presenting “Product’s NPS Tagging Exercise” during the Monthly Customer Experience Leadership Meeting.8

3. Determine the Focus Area Affected

Make a note of the main research focus area affected by this impact. When personalized to your organization, this might include:

  • Lanes of work (e.g., mobile experience, customer experience)
  • Teams/squads (e.g., core product, growth product)
  • Product areas (e.g., enrollment, claims)

4. Note the Type of Research Activity

Sosik’s original template included evaluative research, generative research, and iterative research. In her impact tracker, Smith adds one additional category: “company initiatives.” This distinction allows her to track the impact of internal research as a separate category from product research.

For the example above, Smith would mark her entry as an iterative research activity since it stems from a monthly meeting she leads. She considers presenting NPS during the monthly Customer Experience Leadership meeting to be an iterative research activity because "it involves a systematic repetition of a sequence of tasks executed in exactly the same manner multiple times, provides a deepening understanding of research data and brings a standard of reliability to the research."

5. Select the Impact Type

Then, decide the type of impact that was observed. Categories in Sosik’s framework include:

  • Influencing a product change
  • Influencing a product strategy
  • Increasing stakeholder exposure to users
  • Sharing communications
  • Prompting further research
  • Prompting a new collaboration
  • Elevating the role of user research
  • Developing infrastructure

In this case, since Smith received praise from a harder-to-reach senior executive, she categorized it as elevating the role of user research. Praise is always worth documenting, as it helps to build the credibility of research - especially when you and your team are just starting out.

More established teams might focus more on product strategy impact. In ascending order of scope, here are some of the best types of impact to look for:

  1. A usability issue identified in a prototype test is corrected before launch
  2. The project manager mentions an insight from research in a meeting about product strategy
  3. The CEO mentions an insight from research in an all-hands meeting
  4. An insight about users leads to changes to the roadmap
  5. An insight about users leads to changes to product strategy

6. Determine the Scale of Impact

Lastly, note the scale of the impact observed. Smith’s choices include:

  • Individual stakeholders
  • Individual project or team
  • A larger body of work
  • Organization/company
  • Beyond the company (like the broader UX community)

For the example above, Smith would select “organization” because it impacted a group of people rather than a body of work.9
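Taken together, the six steps above make up a single row in the tracker. As a rough sketch, assuming field names of our own (they are illustrative, not Smith’s actual spreadsheet headers), her example entry might look like:

```python
from dataclasses import dataclass

# Hypothetical record type mirroring the six steps of the walkthrough.
@dataclass
class ImpactEntry:
    date: str               # when the impact was observed (step 1)
    impact_note: str        # descriptive note of the impact (step 1)
    research_activity: str  # study, workshop, or document it came from (step 2)
    focus_area: str         # lane of work, team, or product area (step 3)
    activity_type: str      # evaluative / generative / iterative / company initiative (step 4)
    impact_type: str        # one of Sosik's eight impact categories (step 5)
    scale: str              # individual stakeholder .. beyond the company (step 6)

entry = ImpactEntry(
    date="2023-06-01",  # illustrative date
    impact_note=("Led leadership through a 10-minute tagging exercise of "
                 "NPS comments; CEO called it a smart way to engage leadership"),
    research_activity=("Product's NPS Tagging Exercise, Monthly Customer "
                       "Experience Leadership Meeting"),
    focus_area="Customer experience",
    activity_type="Iterative research",
    impact_type="Elevating the role of user research",
    scale="Organization/company",
)
```

Logged consistently, rows like this make it straightforward to filter by impact type or scale when it is time to report on the research program.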

Summary

While it’s crucial to have a framework to demonstrate impact, there are also tools to help measure it. In-Product Surveys from Sprig enable researchers to place surveys directly in their product or app and get direct feedback from users.

With this type of feedback, researchers can show the value of research by identifying pain points, decreasing churn, and optimizing user journeys, and they can more easily tie research to product strategy goals. Learn how research teams at Square and more are using Sprig to build the products people love: sprig.com/customers.

1 User Interviews (Accessed August 2023) The State of User Research. Retrieved from https://www.userinterviews.com/state-of-user-research-2023-report
2 Chain Store Age (Accessed September 2023) Amazon Eliminates Shopping User Experience Research Team. Retrieved from https://chainstoreage.com/amazon-eliminates-shopping-user-experience-research-team
3 User Interviews (Accessed August 2023) The State of User Research. Retrieved from https://www.userinterviews.com/state-of-user-research-2023-report
4 Sprig (Accessed September 2023) Framework for Demonstrating Research Impact. Retrieved from https://sprig.com/blog/framework-for-demonstrating-research-impact
5 Sprig (Accessed September 2023) Framework for Demonstrating Research Impact. Retrieved from https://sprig.com/blog/framework-for-demonstrating-research-impact
6 Join Learners (Accessed September 2023) Impact UX Research. Retrieved from https://joinlearners.com/talk/impact-ux-research-what-is-it-and-how-do-we-know-weve-achieved-it
7 Sprig (Accessed September 2023) Framework for Demonstrating Research Impact. Retrieved from https://sprig.com/blog/framework-for-demonstrating-research-impact
8 Sprig (Accessed September 2023) Framework for Demonstrating Research Impact. Retrieved from https://sprig.com/blog/framework-for-demonstrating-research-impact
9 Sprig (Accessed September 2023) Framework for Demonstrating Research Impact. Retrieved from https://sprig.com/blog/framework-for-demonstrating-research-impact
