Working as a contract VP of Product for Ditto, I supported a push to deliver a refined glasses selection and fitting experience driven by augmented reality. The work had to be presented both online via a browser and in optometrists' showrooms via a native iOS app on an iPad.
One of my challenges for this in-store digital experience was to simplify the complex process of scanning a customer's head in 3D as part of creating AR views of glasses on their face.
Secondarily, the product search pages, product detail pages, and comparison pages within the commerce flow needed a visual design update as well as improved UX.
I had the support of a UX designer and engineering, as well as control of product specifications. At the same time, it was necessary to lead cross-discipline product development meetings and decision making.

AR RENDERING OF GLASSES
This video shows how the AR glasses are rendered. Zenni uses Ditto's tool to help customers see glasses on their own faces or on the faces of models Zenni has prescanned.
I directed a designer in creating the interactive elements for Zenni shown here, as well as the design of the feature that allows customers to see glasses on their own faces.
FACIAL SCANNING TO CREATE 3D MODEL
We started the AR experience with a visual scan that created an accurate, scaled 3D model of the customer's face and head. The customer had to be guided in turning their head from side to side in a precise way. This screen helped them do this simply by following a series of dots moving back and forth across the screen.
Accurately scaled holograms of the glasses would then be placed on the face. The result, as you can see in the Zenni interface, is a head that can be panned back and forth with accurately scaled glasses on it.

BEFORE AND AFTER OF FACIAL SCANNING

BEFORE

AFTER

I reoriented the app to work in portrait rather than landscape. This made it easier to target a person's face and gave us more screen area for displaying the face and, later, the glasses on the face. This iPad screen was a redesign of an existing screen and served, through animation, to progressively educate customers about the computer vision processes going on behind the scenes. It also helped them learn interesting details about their own facial structure.
Core to the redesign:
• Simplified explanatory text to fit the time customers wanted to spend reading
• Changed from landscape to portrait mode to fit the face more naturally
• Created a bottom navigation bar for ease of use
• Added further filters on the glasses, accessed by tapping “Lens Features” or “Facial Features”
• Made the other features in the ribbon at the bottom of the screen easy to modify

ECOMMERCE EXPERIENCE
The ecommerce experience placed accurately scaled glasses on an image of the customer's face and let them turn their head from side to side to evaluate the fit and look of up to 9 different pairs of glasses at once. The challenge on these screens was to help customers understand how to move between views of the product alone and views of the product on their face.

DEVELOPMENT OF PRODUCT SPECIFICATIONS
Core to my role was the development of product specifications. I was a good fit for product management due to my prior experience in AR and analytics at Amazon; indeed, I was hired at Ditto by someone there who had worked with me at Amazon. I wrote all of Ditto's product specifications while under contract, working closely with engineering and research.