AR View lets you view thousands of products in your home before you buy them.
Design challenges included creating an interaction design that made it easy for customers to place, turn, and move objects in a three-dimensional world from their phone. We developed a system of finger gestures that usability testing showed customers picked up very quickly. You can see the final result of our work in the promotional video below.
The other challenge was enabling a satisfactory product search within the AR view. This was as much a product strategy issue as an interaction design challenge: enabling people to search within AR meant creating robust product sets that would meet customer expectations, while knowing we could never deliver a full catalog search because we would never have holographic images for every product sold on Amazon.
This release leverages Apple's ARKit, so scaling and placing objects was relatively easy. But prior to ARKit, we developed another way of placing a holographic object at scale that did not require a marker. It did, though, require the customer to walk in a semicircle, with the object at the center of the radius, before the object could be initialized and shown. This was a daunting UX challenge: it was very hard to explain quickly to customers.
I solved the problem with a bit of misdirection. I placed a holographic object in the scene with something written on its side. To read it, people naturally walked around to see it better. In this way, our customers walked the arc we needed, and suddenly objects were easily initialized and placed at scale.
I built this demo prototype of the solution in SketchUp and edited it in iMovie.
The product was a success, garnering praise from the press and meeting our customer-engagement goals, as validated by usage data.