Prototyping: low- to high-fidelity
Designing an interactive and actionable report using eye-gaze, facial recognition (AI), and survey data
Overview:
Emozo is a startup. The team knew how to engineer products, but lacked the ability to collect market and user research data and use it to inform their product strategy. And while they valued good UX, they didn't know how to create easy-to-use, intuitive experiences. They sought an advisor to help build their first prototype.
I volunteered.
Role:
Product Manager / UX Researcher / Design Lead (with 2 contract designers)
What did I do?
Uncovered user needs:
I studied competitive products and spoke to industry professionals who either bought similar capabilities or sold services to deliver them. This helped me identify user needs and opportunity areas.
I found that most potential customers relied on their intuition, which they trusted, and wanted just enough data to help them back a hunch.
I learned that while many software companies in the affective computing space provided APIs to collect data, none provided a way to make sense of it; most solutions lacked actionable reports.
I also learned that while potential customers were interested in a tool that collected attention and emotion data, they still wanted the comfort of familiar survey data.
Shaped the product vision:
Emozo would build a DIY platform to inform intuition with the right customer data.
Created a product strategy and roadmap:
To differentiate Emozo in a crowded customer-feedback marketplace, I suggested the company focus on enhancing its report and strengthening its rudimentary survey capability.
The report would be the true differentiator and delighter. It would be designed so that different types of users (business people, creators, and researchers) could derive insights and take action quickly.
We would add just enough survey capability to achieve competitive parity with other survey providers.
A detailed look at the Report design:
Taking the prototype from 0 to 1 was an iterative process. The focus was on creating a report that novice data users could easily act on and customize.
As a first step, a cross-functional team collaborated to create a customer journey map.
Then, the iterative process began.
Improve comprehension:
In the first version, the report used an area graph to show emotions. Users found this graph type hard to decode because the different emotions were layered over one another, so the area graph was replaced with a line graph.
Controls were added so users could simplify the view by de-selecting emotions they did not want to see, as sketched below.
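To make the end state concrete, here is a minimal sketch of the idea: each emotion is drawn as its own line, and de-selected emotions are simply not drawn. The data, emotion names, and matplotlib rendering are placeholders, not Emozo's implementation.

```python
# Minimal sketch: per-emotion line chart where the viewer chooses which
# emotions to display. Data shapes and names are illustrative only.
import numpy as np
import matplotlib.pyplot as plt

seconds = np.arange(0, 60)                      # one score per second of video
emotions = {                                    # hypothetical normalized scores
    "joy":      np.clip(np.sin(seconds / 8) * 0.4 + 0.5, 0, 1),
    "surprise": np.clip(np.cos(seconds / 5) * 0.3 + 0.4, 0, 1),
    "sadness":  np.full_like(seconds, 0.2, dtype=float),
}

selected = {"joy", "surprise"}                  # emotions the user left enabled

fig, ax = plt.subplots()
for name, scores in emotions.items():
    if name in selected:                        # "de-selecting" simply hides the line
        ax.plot(seconds, scores, label=name)
ax.set_xlabel("Video time (s)")
ax.set_ylabel("Emotion intensity (0-1)")
ax.legend()
plt.show()
```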
Increase actionability:
Once the team understood that customers mainly used the data to trim a few seconds from the videos they tested, we added Key Moments (highs and lows), which let users quickly identify moments to cut and moments to retain.
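Under the hood, surfacing Key Moments amounts to finding the peaks and dips of a time series. The sketch below shows that general technique on a made-up per-second engagement signal; it is an assumption for illustration, not Emozo's actual algorithm.

```python
# Illustrative sketch of "Key Moments": flag the highest peaks and deepest
# dips in a per-second engagement signal so users know what to keep or cut.
# This illustrates the general technique, not Emozo's actual logic.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
raw = rng.random(120)                                 # fake 120-second signal
engagement = np.convolve(raw, np.ones(5) / 5, mode="same")  # light smoothing

highs, _ = find_peaks(engagement, prominence=0.1)     # candidate moments to keep
lows, _ = find_peaks(-engagement, prominence=0.1)     # candidate moments to cut

print("Key highs at seconds:", highs.tolist())
print("Key lows at seconds:", lows.tolist())
```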
Most users struggled to tie the three kinds of data together, so a single score combining attention, emotion, and survey data into one metric was added. The score let users compare the variations they had tested.
The big question was how much detail on the score calculation to share with users. Customer feedback guided us again: most users weren't interested in the details, so the way the score is shown was simplified over time.
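To make the idea concrete, here is a hypothetical weighted blend of the three normalized signals. The weights, names, and 0-100 scaling are assumptions for illustration only and are not Emozo's actual formula.

```python
# Hypothetical composite score: a weighted blend of attention, emotion, and
# survey signals, each normalized to 0-1. Weights and scaling are assumptions
# for illustration; the real calculation is not shown here.
def composite_score(attention: float, emotion: float, survey: float,
                    weights=(0.4, 0.4, 0.2)) -> float:
    """Return a 0-100 score that lets users compare tested variations."""
    w_attn, w_emo, w_survey = weights
    blended = w_attn * attention + w_emo * emotion + w_survey * survey
    return round(blended * 100, 1)

# Comparing two tested video variations:
print(composite_score(attention=0.72, emotion=0.65, survey=0.80))  # 70.8
print(composite_score(attention=0.55, emotion=0.50, survey=0.78))  # 57.6
```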
Step 1: Laying out use cases
Step 2: Iterating
Each iteration was informed by user feedback, making the report more usable and useful.
Discoverability:
In the eye-gaze heat map, the colors representing gaze concentration were not visible against white backgrounds. The palette was adjusted so the heat map reads clearly over videos with both light and dark backgrounds.
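One common way to achieve this is to scale the overlay's opacity with gaze density and use a saturated colormap, so hot spots stand out on white and dark frames alike. The sketch below illustrates that approach with placeholder data; it is not Emozo's rendering code.

```python
# Sketch of the heat-map fix: draw gaze density as an RGBA overlay whose
# opacity scales with density, so hot spots stay visible over both white
# and dark video frames. Data and colormap choice are illustrative.
import numpy as np
import matplotlib.pyplot as plt

frame = np.ones((90, 160, 3))                        # stand-in for a white video frame
yy, xx = np.mgrid[0:90, 0:160]
density = np.exp(-(((xx - 60) ** 2) / 400 + ((yy - 40) ** 2) / 200))  # fake gaze density
density /= density.max()

overlay = plt.cm.turbo(density)                      # saturated colormap -> RGBA image
overlay[..., 3] = density * 0.85                     # transparent where no one looked

fig, ax = plt.subplots()
ax.imshow(frame)
ax.imshow(overlay)
ax.set_axis_off()
plt.show()
```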
Delight:
We added controls that let users interact with their data and customize the report to tell a story.
The video player showing attention data has play and pause buttons, giving users control over playback.
A zoom function lets users dive into specific sections of the attention and emotion graphs in more granular detail. This is especially useful when longer videos are tested; too many data points otherwise result in a bunched-up view with indistinguishable points.
We made the Word Cloud interactive so users can choose which verbatims to show: they can search for words, tag them, and display only their selections. Users can also simplify the Word Cloud by limiting the number of words shown.
Controls were added to filter the data and focus on specific demographic segments.
When multiple options are tested together, users can simplify their view and see data only for the winning option.
Outcome:
The platform had a soft launch in August 2021. In the first three months post-launch, Emozo acquired 10+ trial customers and 2+ paying customers.
Customer feedback on the interactive, actionable reports has been extremely positive. Users unfamiliar with data analysis are able to draw insights and take action.
Customer Testimonial:
"Emozo has been really helpful in understanding the customer reaction/perception of the brand ads that we did in 2021. The reports definitely helped us prioritize between different videos for different channels of distribution."
Sateesh, Digital Marketer at a leading food delivery service in India