Visual Search is a feature of the Amazon App driven by machine learning. Prior to my arrival in Palo Alto to lead design for the Visual Search & Mobile Innovation team, the product roadmap for this Amazon business unit read more like a roster of university research experiments. That made sense, since the department was largely run by postdoc researchers continuing the work they had begun in academia. But while the actual tech, machine learning, was world class, this strategy was not putting features in place that customers were embracing.
Additionally, when I arrived, designers were being ignored. There was only one designer on the team, Sudeshna Pantham. Luckily she was brilliant, but as someone who could contribute product ideas, she was sidelined.
So I did two things in response. First, I put a cohesive design strategy in place that leveraged both analytics and design thinking processes. Fed by data AND direct customer engagement, the research scientists began to listen to my team's ideas.
Second, I hired four more designers with complementary strengths in UX, visual design, computer vision, augmented reality, and ideation.
But ideas aren't enough. I had to put solutions into action.
Through ongoing conversations with leadership, and through my team's ability to communicate ideas and partner with the technical program managers and developers, I was able to introduce customer-centric features into our roadmap.
At a tactical level, I directed a redesign of the app that included a clearer region of interest, search results augmented with product attributes displayed as tags, alerts and coaching flags that were more easily understood, and the ability to upload photos for scanning.
These features launched in the Amazon mobile app and, as analytics showed, resulted in stronger customer engagement.
Below is an image of one test designed to use Harris Points (the blue dots) as a targeting tool AND progress indicator. We were also experimenting with adding grayed-out tags to help people aim the camera.
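To make the idea concrete for readers unfamiliar with the technique: Harris points are corners detected where an image has strong texture in two directions, which is what makes them useful both for aiming the camera and as a rough progress signal. Below is a minimal sketch of how such points can be pulled from a camera frame using OpenCV's stock Harris detector; the function name, threshold, and parameter values are my own illustrative assumptions, not the production code.

```python
# Illustrative sketch only: OpenCV's stock Harris detector,
# not the actual Visual Search pipeline.
import cv2
import numpy as np

def harris_points(frame_bgr, threshold_ratio=0.01):
    """Return (x, y) pixel coordinates of strong Harris corners in a frame."""
    gray = np.float32(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY))

    # blockSize: neighborhood considered for each corner;
    # ksize: Sobel aperture; k: Harris free parameter (typical default).
    response = cv2.cornerHarris(gray, blockSize=2, ksize=3, k=0.04)

    # Keep only points whose response is strong relative to the best corner.
    ys, xs = np.where(response > threshold_ratio * response.max())
    return list(zip(xs.tolist(), ys.tolist()))
```

A live overlay could then render these points as the blue dots, and the per-frame count could double as a crude progress signal: more detected points generally means the camera is seeing enough texture for the recognition model to work with.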
Below is the first release of the tagging system.
As you can guess, in order to be successful in my role leading design for the Mobile Innovation Team, I had to quickly develop a good understanding of machine learning. But the effort paid off: it let me influence the design of our computer vision features and work collaboratively with both the research scientists and engineering.
The team I built was also very strong in Design Thinking. As a result of the number of good ideas we generated, the head of the business unit began turning to us, rather than the product managers, when he needed new feature and product ideas to share with Amazon leadership.
Outcome
I took a design team that was not recognized as having a role in product ideation and built it into a recognized product ideation leader in a complex technical field.
I also improved the product roadmap. We integrated a customer focus into how we assessed potential features and developed their requirements. This in turn improved our app's performance with customers and the overall perception of the Visual Search & Mobile Innovation team by Amazon leadership in Seattle.