Lesson #1: It’s impossible to read your customers’ minds.
Over the course of my career, I've been the founding product manager at five different startups.
In December of 2011, I joined Bobby Lo as the first PM at Vurb, which was later acquired by Snapchat.
Vurb was a consumer search app built around discovery and sharing. I knew that for our customers to see Vurb's value and become loyal users, we needed to quickly fix any negative customer experiences.
Getting to the root of those problems was my mission.
So I started asking for feedback from our early users. I made assumptions based on our analytics and some internal hypotheses and sent surveys to gauge interest in features and understand points of friction.
And something really interesting happened…
When I asked users to choose which features they wanted most or what we could fix, almost everyone filled out the “Other” category. And they used their own words to tell us something we’d never even thought about.
One of our biggest product breakthroughs came to us this way. See...we initially built Vurb to be a search product, but our customers ended up telling us they actually valued the curation of the content (similar to Pinterest) more than the search functionality. After hearing those responses over and over, we began to orient the new user experience around finding other users' content instead of discovering content for themselves. We had a hunch it would improve retention, but were blown away when retention improved by 74% after implementing the changes.
That’s when I realized how powerful open-ended responses could be.
Lesson #2: Analyzing a lot of qualitative data is…HARD.
In 2014 I left Vurb and joined Weebly, again as the first PM.
I already knew the importance of qualitative data and letting your customer do the talking.
But with 50 million users (many more than Vurb), I quickly realized that collecting and analyzing qualitative data from all of our customers was going to be incredibly time-consuming.
Luckily, we had a user research team. But that just meant our head researcher, Lyssette, would have to spend days digging through survey responses, something even she didn’t have time to do. And we were releasing products so frequently that by the time she was done, we’d already shipped the feature release or UX change in question.
So we were left to make guesses...or listen to the loudest opinion in the room. And at Weebly, LOTS of people had opinions. Like our free power user Sean, who even found my Facebook profile and messaged me feature requests. Needless to say, his opinion wasn’t representative of our paid users who ended up wanting very different features.
Lesson #3: Customer research tools haven’t kept up with the modern PM tech stack.
I was left feeling frustrated. In so many aspects of my life as a PM, I had the right tools. Analytics, A/B testing, heat mapping.
But with customer research, it all felt broken.
Analytics couldn't tell me why customers were doing things. Traditional user research and surveys took forever. And relying on individual customer feedback could quickly point us in the wrong direction.
And after spending countless hours talking to other PMs and UX researchers, it became apparent I was not the only one who felt this way.
So in 2018, I decided to leave my job at Weebly and create a product that would solve this problem.
I started on my journey to create a modern research tool. One that...
1. Was simple to implement and easy to use
2. Could survey customers right within the product experience
3. Would analyze qualitative responses for you and surface actionable insights
With several years building consumer products under my belt, I already had a few ideas about how to make #1 and #2 a reality. My next big task was figuring out how to build a product that could deliver on #3.
So how do you generate qualitative insights at scale?
To build a product that would automate the process of collecting and analyzing qualitative survey responses, I knew we’d need to use artificial intelligence and build a machine learning model.
Within a few months, I’d convinced Kevin Mandich, our first data scientist, to help me build it.
We got to work, teaching our model how to recognize thematic similarities from qualitative survey responses, even in the absence of overlapping words and phrases. Kevin programmed the model to use deep neural networks to consider the entire context of the responses, as opposed to looking at individual words or phrases.
If that sounds confusing, this graphic might help you understand:
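If a sketch helps more than a graphic, here's a toy illustration in plain Python. The embedding vectors are hand-made stand-ins (in the real system, a deep neural network produces them from the full context of each response), and the greedy grouping is a deliberately simplified version of how similarity in embedding space can cluster responses that share no words:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical embeddings: the first two responses share no words
# but express the same theme, so their vectors point the same way.
responses = {
    "Checkout keeps failing on my phone": [0.9, 0.1, 0.2],
    "I can't complete a purchase from mobile": [0.85, 0.15, 0.25],
    "Love the new dashboard design": [0.1, 0.9, 0.1],
}

def group_by_theme(embeddings, threshold=0.9):
    """Greedy clustering: each response joins the first group whose
    representative vector is similar enough, else it starts a new group."""
    groups = []  # list of (representative_vector, [texts])
    for text, vec in embeddings.items():
        for rep, members in groups:
            if cosine(vec, rep) >= threshold:
                members.append(text)
                break
        else:
            groups.append((vec, [text]))
    return [members for _, members in groups]

themes = group_by_theme(responses)
# The two "broken mobile checkout" responses land in one theme,
# the dashboard compliment in another.
```

The key point the toy version captures: grouping happens in vector space, so overlap in meaning matters, not overlap in wording.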
Once that was completed, we programmed the model to analyze the themes that emerged and surface insights to monitor or take action on….like this:
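As a simplified illustration of that second step, the sketch below assumes the model has already tagged each response with a theme and a sentiment (these tags are invented for the example) and shows how frequency counts can separate "monitor" insights from "take action" insights:

```python
from collections import Counter

# Hypothetical (theme, sentiment) tags a model might emit per response.
tagged = [
    ("mobile checkout broken", "negative"),
    ("mobile checkout broken", "negative"),
    ("mobile checkout broken", "negative"),
    ("dashboard design", "positive"),
    ("pricing confusion", "negative"),
]

def surface_insights(tagged, min_mentions=2):
    """Rank themes by frequency; flag recurring all-negative themes
    as action items, other recurring themes as ones to monitor."""
    counts = Counter(theme for theme, _ in tagged)
    negatives = Counter(theme for theme, s in tagged if s == "negative")
    insights = []
    for theme, n in counts.most_common():
        if n < min_mentions:
            continue  # one-off mentions stay below the surface
        if negatives[theme] == n:
            insights.append(f"ACT: '{theme}' raised {n} times, all negative")
        else:
            insights.append(f"MONITOR: '{theme}' raised {n} times")
    return insights

insights = surface_insights(tagged)
```

Here only the recurring, all-negative checkout theme clears the bar, so it is surfaced as an action item while one-off mentions are filtered out.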
With our AI/machine-learning model complete, we were almost ready to launch. After a few more months of working around the clock, we finally had a product.
It took a while to find our footing, but since launching earlier this year, we’ve tracked over 260M visitors and captured 400K survey responses for companies like Square and Codecademy.
Take it for a spin and let me know what you think.
I created Sprig to provide PMs and UX researchers with a tool I wish I had earlier in my career. For that very reason, we’ve made it easy (and free) to get started.
You can install the Sprig SDK on your website or app, launch your first survey, and start collecting data about your users in just a few hours. And if you’re not sure what to ask, you can try out 75+ free micro-survey templates, which contain a series of survey questions created by our VP of Research, Allison Dickin.
Not surprisingly, I love to hear what customers think, so feel free to shoot me a note at firstname.lastname@example.org and let me know how you like it!