Thought Leadership

A Research Prompt Framework that Fixes Generic AI Responses

Written by Anat Mooreville | July 30, 2025

How do you use AI in your end-to-end workflow? It’s still murky waters out there, and if you ask five different researchers this question, you might get five different answers. As enterprises prioritize efficiency and cross-functional teams adopt AI tools, the pressure mounts for researchers to adopt this technology thoughtfully without compromising the rigor and quality of their insights.

But how? 

I got a glimpse at Learners’ annual Research Week, a free virtual and in-person UX research conference that is a summer highlight for many in the field. One of the most popular talks this year came from Kaleb Loosback (Staff Researcher, Instacart): “The AI-Empowered Researcher: How to Dance with the Devil and Keep Your Soul.”

It’s a provocative metaphor. “Dancing with the devil” refers to engaging in risky or dangerous behavior, often with the potential for dire consequences. But rather than leaning on the oft-used metaphor that we should “partner” or “collaborate” with AI, Kaleb’s talk offers a concrete framework that anyone can use to mitigate the risk of flawed outcomes. I think his talk resonated so well with the crowd not only because he demonstrated the wide range of AI tools he used across the research cycle, but also because researchers at any company size or stage of research maturity can apply his prompting methodology.

The basic premise is: garbage in, garbage out. Poor inputs all but guarantee poor outputs. Kaleb’s CRAFTe framework gives research and product teams a simple way to make their prompt instructions clear and effective. Here’s what each letter stands for (a sketch of a fully assembled prompt follows the walkthrough):

C - Context

Who are you? What’s the background of the study? What are the business goals and research objectives? Who are the most important stakeholders, and what previous research has been done? Kaleb specifically advises feeding the AI your research plan. Yes, you still need to come up with a research plan that clearly defines the importance of the problem, what questions you want answered, how you want to answer them, and what you hope to do with the results.

R - Role

Help focus the AI’s mindset and domain expertise by defining the role it should play. For example: “Act as a Principal Researcher with a PhD in HCI.” That’s who you want to partner with, right? :) Depending on the task, you might instead prompt it to take on the role of your company’s CEO or your division’s VP if you’d like to pressure-test executive responses.

A - Actions

Rather than make a sweeping, broad request like “Give me the top three insights from these interviews,” give numbered instructions so the AI doesn’t skip important steps. For example: First, review the research plan. Second, review the participant transcript. Third, generate a session summary, etc.

F - Format

You or your company might already have a preferred research report template. Dictating this format to the AI reduces post-processing time. It might include bullet lists, citation styles, and even tone (informative, action-oriented, and collaborative). You can even name the target audience (product managers, marketing, sales) to home in on strategic implications.

T - Template

When possible, supply an actual template (session-summary doc, transcript analysis table, etc.) to guide the AI’s structure. For example, you may want each takeaway to come with two key participant quotes as evidence. Kaleb’s pro tip is to ask the AI to generate a template from an example of a finished product.

e - Examples

Show what an “ideal” output looks like. Kaleb insists this is the biggest lever for quality: once the AI sees model outputs, expectations align. 
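
To make the framework concrete, here is a minimal sketch of how the six parts might be assembled into one prompt. This is my illustration, not code from Kaleb’s talk; the field names, the placeholder research plan, and the example strings are all assumptions.

```python
# Illustrative sketch only: assembles a CRAFTe-style prompt from its six parts.
# The dataclass fields and example strings are hypothetical, not from Kaleb's talk.
from dataclasses import dataclass

@dataclass
class CraftePrompt:
    context: str   # C - study background, goals, stakeholders, research plan
    role: str      # R - the persona the AI should adopt
    actions: str   # A - numbered steps so nothing gets skipped
    fmt: str       # F - report format, tone, target audience
    template: str  # T - an actual document template to follow
    examples: str  # e - one or more model outputs to anchor quality

    def render(self) -> str:
        # Order the sections explicitly so the model sees context first.
        return "\n\n".join([
            f"## Context\n{self.context}",
            f"## Role\n{self.role}",
            f"## Actions\n{self.actions}",
            f"## Format\n{self.fmt}",
            f"## Template\n{self.template}",
            f"## Examples\n{self.examples}",
        ])

prompt = CraftePrompt(
    context="Research plan: onboarding study for a B2B dashboard...",  # paste your real plan here
    role="Act as a Principal Researcher with a PhD in HCI.",
    actions="1. Review the research plan.\n2. Review the participant transcript.\n3. Generate a session summary.",
    fmt="Bulleted summary, informative tone, audience: product managers.",
    template="Takeaway: ...\nEvidence: two participant quotes per takeaway.",
    examples="[Paste a finished session summary you consider ideal.]",
)
print(prompt.render())
```

One side benefit of structuring the prompt this way: no section can be silently omitted. If you have no examples yet, the empty field is staring back at you.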

As you can tell, this prompting methodology is rigorous. It helps ensure, as Kaleb puts it, that the researcher leads the dance with the AI, rather than the AI steering the researcher across the floor. In a panel discussion at Learners on “UX Research in 2030,” Jane Justice Leibrock, the Head of UX Research at Anthropic, also made the point that it is crucial to keep control of the collaboration:

“I’ve noticed something. I think there can be a first-thought problem with AI where if it becomes your habit to automatically think of it to solve any problem you may have, I find that immediately just asking it the question… is not nearly as good as sitting and thinking myself first [and then] giving AI the context of what I care about and what my hunches are.” 

If CRAFTe helps you dance with the devil, how do you save your soul? Kaleb offers four guidelines:

  1. Always review and validate outputs. It’s your reputation on the line, not the AI’s. 
  2. Be mindful of ownership, privacy, and ethical concerns. Are the inputs things you have the right to upload? Are you working in a secure, sanctioned environment where uploads will not train public models (see the Samsung debacle)? Protect participant data and PII (personally identifiable information); a rough redaction sketch follows this list.
  3. Actively review for stereotypes and generalizations. AI outputs can reflect the biases and injustices of our societies. 
  4. Be transparent and disclose AI use. Kaleb offers a concise disclaimer as an example: “The analysis was conducted using a combination of AI-powered data extraction and human evaluation and interpretation, ensuring both efficiency and accuracy.”
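
On the privacy point in item 2 above, one lightweight safeguard is to redact obvious PII from transcripts before they reach any AI tool. The regex patterns below are a rough, hypothetical starting point, not a complete anonymization solution and not a feature of any particular product.

```python
# Rough sketch: regex-based redaction of obvious PII before uploading a transcript.
# These patterns are illustrative and will miss plenty; they are not a substitute
# for a sanctioned, secure environment or a real anonymization review.
import re

PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    # Replace each match with a labeled placeholder so coders can still
    # tell what kind of information was removed.
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

print(redact("Reach me at jane@example.com or 415-555-0123."))
# -> Reach me at [EMAIL REDACTED] or [PHONE REDACTED].
```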

As more research teams explore AI to speed up synthesis and content creation, many are realizing that better prompts — not just better tools — are the key to getting high-quality results. In fact, Kaleb used 11 (!) AI tools across his research process, from intake and scoping through analysis and presentation. 

How does CRAFTe integrate with a tool like Sprig? Sprig also believes that AI works best as a research accelerator, not as a lead dancer. While Kaleb’s talk highlighted a qualitative case study, CRAFTe applies equally to surveys; he wrote me that “you can easily modify the prompt to assist with drafting screener surveys, coding open ends, or even assisting you with R code.”
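
Continuing the hypothetical CraftePrompt sketch from earlier, here is how the same six parts might frame an open-ends coding task. The survey question, codebook instructions, and output columns are invented for illustration.

```python
# Hypothetical example: reusing the CraftePrompt sketch above to code
# open-ended survey responses. The question and codebook are invented.
coding_prompt = CraftePrompt(
    context="Survey: post-onboarding NPS follow-up. Question: 'What is the main reason for your score?'",
    role="Act as a senior survey methodologist experienced in qualitative coding.",
    actions="1. Read the codebook.\n2. Assign each response at most two codes.\n3. Flag responses that fit no code.",
    fmt="CSV with columns: response_id, code_1, code_2, flag.",
    template="response_id,code_1,code_2,flag",
    examples="r_001,PRICING,,no",
)
print(coding_prompt.render())
```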

Surveys are notoriously difficult to design and interpret correctly, and AI has the potential to help teams practice the methodology thoughtfully. I was especially intrigued by what Arianna McClain (Head of UX Research at OpenAI) said about surveys during the same “UX Research in 2030” panel. Coming from IDEO, where qualitative research is king, her words carried even more weight:

“I always thought you did qualitative research and what you learned helps you build a survey, and then you use a survey to understand people at scale. And I think that with AI, it really is possible to talk to people at scale… A really well-written open-ended response [survey] can really take you far… I’m really excited for more people to hear what people have to say, instead of going in with an hypothesis with what you think people want.”

At Sprig, we are too.

Resources:

See Learners’ Conference Agenda: https://joinlearners.com/research-week/ai-and-uxr/ 

Watch Kaleb’s talk: https://www.youtube.com/watch?v=K02PVXI9maM&t=11941s 

Read Kaleb’s example CRAFTe Prompts: https://www.heykaleb.com/musings/ai-empowered-researcher-framework?utm_source=sprig 


Written by Anat Mooreville

She is a design strategist and researcher who combines expertise in service design, qualitative research, and workshop facilitation to align stakeholders on user needs and inform strategic bets. Her experience spans financial services, healthcare, and innovation consulting, and she enjoys the great hiking in the Bay Area.
