UserTesting hired me as a Senior Product Designer to lead design and research efforts that made remote UX testing seamless and simple for researchers, designers, and businesses as a whole.
My team (Customer Group) was responsible for improving the core moderated and unmoderated testing products used by our customers, mainly research and product teams at startups through enterprises. This area was riddled with clunky experiences and design and tech debt that blocked customers or left them confused at key steps of their research and testing workflows.
Working closely with research, engineering, design systems, and data teams, we quickly swarmed on the critical platform issues hurting customer experience and our business metrics the most. From assisting with business onboarding and discovery, crafting research readouts, and leading design sprints and workshops, to running usability tests and guiding QA and handoff, I iteratively chipped away at monolithic problems and improved the experience over time.
Deeply understand our customers' needs, pain points, and jobs to be done (JTBDs) to deliver an excellent digital research experience that provides critical user insights for their business.
2 Years, October 2020 - November 2022
Director, Product Design
As Customer Group began planning our H2 2021 roadmap, the user research and data teams met to examine customer feedback. A key problem emerged: customers couldn't preview or simulate mobile tests.
“How can I preview the tests on a mobile device? I want to make sure that the images are sized correctly.”
“After building my screener questions, there is no way to preview them from the participant's perspective on a mobile device."
“The desktop preview app is different from what the users will actually experience, especially for mobile tests, so we can't fully validate our test plan from the preview."
Customers had difficulty understanding how their tests would appear and behave for mobile audiences, and our desktop-based platform blocked them from finding out. The result: inaccurate tests sent to participants, and wasted time and resources reverting and fixing them.
After briefing with teams on this problem, I put myself in our customers' shoes and walked through their workflow for creating mobile tests within our platform, capturing key moments and sentiment on a journey map.
When customers were blocked from previewing mobile tests, they "hacked" their way around it by creating a separate test, sending it to themselves, and opening it in a mobile browser. A frustrating, clunky experience for what should be a simple task.
I circulated this map and its findings with my PM and stakeholders to gain buy-in on where we should focus our efforts, and we reached agreement and alignment.
Journey map
With all of us on the same page, I facilitated a remote workshop with my team to crowd-source ideas, inspiration, and opinions on how we could tackle this problem. Engineers, product managers, designers, and researchers all participated.
One idea from our mobile engineers captured the group's attention: using native tech, triggered by a QR code, to simulate the mobile test experience directly on customers' devices.
This was bleeding-edge functionality: "App Clips" on iOS and "Instant Apps" on Android, both of which let users run a lightweight slice of a native app without downloading it. Customers could scan a test's QR code and immediately simulate their test on their mobile device.
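Under the hood, both App Clips and Instant Apps launch from a URL registered to the app, so the preview QR code only needs to encode a per-test invocation URL that the OS resolves to the lightweight preview experience. A minimal sketch, assuming a hypothetical invocation endpoint and parameter names (not UserTesting's actual API):

```python
from urllib.parse import urlencode

# Hypothetical sketch: App Clips (iOS) and Instant Apps (Android) are both
# launched from a registered URL, so the preview QR code just needs to encode
# a per-test invocation URL. The domain and parameter names below are
# illustrative only.

BASE_URL = "https://preview.example.com/clip"

def build_invocation_url(test_id: str, session_token: str) -> str:
    """Build the URL a test's preview QR code would encode."""
    return f"{BASE_URL}?{urlencode({'testId': test_id, 'token': session_token})}"

url = build_invocation_url("t-48213", "s-abc123")
print(url)
```

The URL would then be rendered as a QR code in the Preview modal; scanning it with a device camera hands the URL to the OS, which launches the App Clip or Instant App without a full install.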
I worked closely with our mobile engineers to understand how this native tech would work in our existing platform and UX, while mapping out all the various possibilities along our customer testing journey.
In parallel, our engineering team dissected App Clips and Instant Apps to get a better sense of their technical feasibility and inner workings.
After gathering internal feedback from stakeholders, we decided this native tech was the best path forward, both technically feasible and viable for the project and roadmap. Now we just needed to gauge user desirability.
Since we were adopting a third-party solution, I needed to identify exactly how and where we'd integrate it into our existing Preview function for better discoverability and increased adoption, while keeping scope in check.
Since App Clips and Instant Apps were triggered by a QR code, it made sense to embed it within our main Preview modal, launched from the customer's Test Builder. The prompt would then guide the customer toward completing their test from a mobile device. After a few rounds of internal design reviews, I landed on a proposed version to test with customers.
Working closely with our lead researcher, we came up with a test plan and clear research objective: Understand and measure discoverability, value, and overall usability of the new Preview App Clip/Instant App function with customers.
I launched the test with 5 customer participants and the results were overwhelmingly positive:
Crafting happy and sad design paths: native tech vs. app already installed.
With feasibility, viability, and now desirability checked off, I worked with my PM and engineering team to break the designs and user flows down into digestible specs. By being detailed and thorough with annotations and UI specs, engineers were able to review them easily and translate design into code.
I also created click-through prototypes to simulate what the desktop-to-mobile experience would be like.
After a month of building and testing, our engineering team was ready to slowly roll out the mobile preview feature to our working beta group of customers. This ensured the new code and experience wouldn't break the existing environment and platform while we monitored adoption and usage metrics.
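A staged rollout like this is typically gated by a percentage-based feature flag: each customer is deterministically bucketed, so the cohort only grows as the percentage is raised. A minimal sketch of that gating logic (function, flag, and customer names are hypothetical, not UserTesting's code):

```python
import hashlib

# Illustrative sketch of percentage-based beta gating. All names here are
# hypothetical; this is one common way to implement a gradual rollout.

def in_beta_cohort(customer_id: str, rollout_percent: int,
                   flag: str = "mobile-preview-qr") -> bool:
    """Deterministically bucket a customer into 0-99. The same customer
    always lands in the same bucket, so widening the rollout (5% -> 25%
    -> 100%) only ever adds customers, never removes them."""
    digest = hashlib.sha256(f"{flag}:{customer_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < rollout_percent

# Example: check whether a customer sees the new mobile preview at 25% rollout.
print(in_beta_cohort("customer-1042", 25))
```

Because bucketing is keyed on the flag name plus the customer ID, different experiments get independent cohorts, and a customer's experience stays stable as the rollout percentage increases.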
3 weeks later, everything functioned as desired.
The new mobile testing preview was launched as part of our company-wide Fall Release and immediately received high praise and adoption. Customers loved how seamless and intuitive it was to create their mobile tests on desktop and instantly preview them from a mobile device.
We even got a shoutout in our company-wide All Hands meeting, as the project was seen as a cornerstone in addressing critical customer needs on a relatively short timeline.
| KPIs | Outcomes |
| --- | --- |
| Mobile tests launched and recorded | +11% |
| Mobile preview QR code triggers | 2.3 per session |
| CSAT | 91% |
| Time-on-task, preview → launch | 7.8 minutes (-43%) |