Fact: no one enjoys used car shopping.
Between the searching, test driving, and (dare I say it) negotiating, it can be an exhausting undertaking.
Well, that was until Shift came on the scene. Since 2013, Shift has been helping used car buyers and sellers “skip the headaches of used car buying” with a technology-driven, hassle-free customer experience – including intuitive search recommendations, test drives delivered straight to the buyer’s door, fair no-haggle pricing, and great financing options.
This cult-favorite online marketplace owes its success to years of testing and, you guessed it, user research.
However, the team hadn’t always relied so heavily on customer research.
For a long time, the team relied solely on quantitative data and metrics, which allowed them to measure the success of new initiatives – but this didn’t help them understand what actually made a new feature successful or not.
“For example, we just launched a feature that we thought was a home run, but then the data was a flop…” Adam Johnston, Director of Product Management at Shift, recounts, “What happened? How could this world-changing feature not have blown our customers away? Why do they not care about it? Or why did the numbers go down? What happened?”
This realization of just how limiting purely quantitative research can be led Adam to jumpstart Shift’s qualitative research program.
Shifting into Research Led
In 2017, Shift implemented SurveyMonkey to start collecting qualitative data and customer satisfaction feedback during the online car shopping experience.
The survey was emailed to shoppers one day after their session – but only to those who had submitted their email address on the website, thereby generating a lead. It included two questions:
1) How satisfied are you with your shopping experience on Shift?
2) What could we do to improve?
Adam recalls that “SurveyMonkey’s capabilities at the time didn't capture our business needs and so we kind of landed on like, okay, what other platforms are out there that would enable us to conduct the surveys and do a little bit of analysis on the data that we get back from them.”
The results the team received from the SurveyMonkey study were grim – with only 30% of respondents reporting they were satisfied with their car shopping experience on Shift.
At first glance, this was a startling data point. However, after manually reviewing the open-text responses received, Adam noticed that most of the responses were not applicable to the online shopping experience and that there was “so much noise that it wasn’t even worth [his] time to review the results.”
The team concluded that the methodology was off: the survey was excluding many of the people they actually wanted to hear from, such as shoppers who browsed cars on the site but never provided an email address.
However, SurveyMonkey didn’t offer the targeting capabilities the team needed to narrow their survey’s audience. That’s why the team switched over to Sprig.
The Switch to Sprig
The team began working with Sprig’s Head of User Research, Allison Dickin, to refine and optimize the methodology to ensure the correct people were receiving the survey during their shopping experience.
Shift wanted feedback from users who were actively browsing for cars on the Shift website, rather than from users who had already submitted their email address – a signal that they were at the tail end of their shopping journey and had requested to purchase or test-drive a car.
To do this, the team leveraged Sprig’s web delivery, which allowed them to surface their customer satisfaction “microsurvey” to eligible users while they were actively browsing for cars on the Shift site. To qualify, users needed to:
Land on Shift’s car browsing directory
Spend 2+ seconds on the page
Have not submitted an email address to Shift
Shift’s customer satisfaction survey was then delivered to 5% of users who met these qualifications (Shift has a lot of users!) – directly within their browsing experience.
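Conceptually, those targeting rules boil down to a simple filter plus a random sample. A minimal Python sketch of the logic (all names here are hypothetical – in practice, this targeting is configured in Sprig rather than hand-coded):

```python
import random

def is_eligible(session):
    """Apply the three qualification rules described above (hypothetical session dict)."""
    return (
        session["page"] == "car_browsing_directory"  # landed on Shift's car browsing directory
        and session["seconds_on_page"] >= 2          # spent 2+ seconds on the page
        and not session["submitted_email"]           # has not submitted an email address to Shift
    )

def should_show_survey(session, sample_rate=0.05, rng=random):
    """Surface the microsurvey to 5% of users who meet the qualifications."""
    return is_eligible(session) and rng.random() < sample_rate
```

Sampling only a small share of a large eligible audience keeps the survey from feeling intrusive while still producing plenty of responses.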
The Results: A 340% increase in response rates (plus hearing from the right users!)
As a result of the updated methodology and in-product delivery, Shift saw a 340% increase in response rates – collecting over 13,000 responses within 6 months of launching, compared to 12,000 responses over 2 years with SurveyMonkey. Not to mention, the responses all came from the correct, qualified user base.
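For a rough sense of where a figure like that comes from, compare responses per month across the two tools (a back-of-the-envelope sketch using the volumes above; the reported 340% is Shift’s own response-rate figure and may be computed differently):

```python
sprig_monthly = 13_000 / 6          # responses per month in the 6 months with Sprig
surveymonkey_monthly = 12_000 / 24  # responses per month over 2 years with SurveyMonkey

increase = (sprig_monthly - surveymonkey_monthly) / surveymonkey_monthly
print(f"{increase:.0%}")  # prints "333%" – in the same ballpark as the reported ~340%
```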
Adam recalls that “the insights from SurveyMonkey created a narrative that the shopping experience was negative and people hated it, when in reality the metrics with Sprig told a much different story” with 77% of respondents saying they were satisfied or extremely satisfied with the car shopping experience.
These updated (and accurate) stats brought a renewed sense of optimism and pride to Shift’s product team.
With the info from SurveyMonkey, “there was a bit of a narrative that had developed internally at Shift that everything was horrible – our website, our shopping experience, you name it. And, while yes everything can absolutely be improved, the truth is that our customers are actually really happy with the shopping experience and it’s actually a lot better than most of what else is out there,” says Adam.
This realization gave the team new baseline metrics to work from and a clear path forward for continuing to improve the shopping experience, aided by hyper-relevant insights from automatically analyzed open-text responses.
The team uncovered that users care most about features that help them find the perfect car – like search filters and in-depth descriptions – with everything else about the shopping experience coming second.
They turned these learnings into new projects to revamp the initial car browsing experience, adding more of the highly requested search filters. Shift also introduced a new and improved recommendation algorithm to ensure shoppers are always matched with the right car.
The team also refreshed the way car details were shared with updated descriptions, specific feature checklists, 360-degree car tours and photos, and additional information on how prices are determined for each car.
“We’re getting gold out of this survey now. The SurveyMonkey survey wasn’t even worth me looking at because there was so much noise and it was so disconnected from the product experience we were trying to evaluate – but with Sprig, it’s all relevant to what we are working on or should be working on,” Adam says.
The team is hungry to receive more “gold” insights from Sprig and has since expanded their in-product research program to include qualitative studies across the entire buying and selling journey on Shift.
How you can replicate Shift’s success
Obtaining qualitative data in context: Capture hyper-relevant insights by targeting specific users during specific moments throughout the customer journey. This means asking for feedback in the moment — when they are using your app or website.
Considering where users are in their journey: Don’t lump everyone together in one study! Instead, customize the audience for the question(s) you are trying to answer. For Shift, this meant asking different questions based on what the shopper had accomplished on their website – were they just browsing? Did they sign up for a test drive? Did they complete a purchase? By understanding where the customers were in their journey they were able to receive much more relevant and actionable insights.
Capturing data continuously: Your customers and product are always changing, so you should be capturing insights constantly to keep up with those changes. Continuous findings mean constant learning and quick iteration – all of which benefit users and, in turn, the organization.
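The “consider where users are in their journey” tip above can be sketched as a simple stage-to-question mapping (the stage names and questions here are hypothetical examples, not Shift’s actual survey content):

```python
# Hypothetical mapping from journey stage to the microsurvey question to ask.
QUESTIONS = {
    "browsing": "How easy is it to find cars that interest you?",
    "test_drive_booked": "How was scheduling your test drive?",
    "purchased": "How satisfied are you with your purchase experience?",
}

def pick_question(stage):
    """Return the stage-appropriate question, falling back to general satisfaction."""
    return QUESTIONS.get(stage, "How satisfied are you with your shopping experience?")
```

Keying questions off journey stage is what turns one generic satisfaction survey into several targeted, actionable ones.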