Amazon Go: Artificial Intelligence and the in-store experience
You could say Amazon Go almost broke tech media’s internet this past December. Driverless shopping, checkout-free shopping, almost-‘shoplifting’, deep learning, computer vision and job-killer AI were a few of the phrases that made the rounds. Taking a step back, one does wonder why this is surprising, and how we all missed it entirely. The signs have been there all along.
The online giant realized earlier than anyone else that being the fastest shipping service in the country was a massive value add for shoppers. Same-day, even same-hour, deliveries became the norm. The site became the go-to destination for product discovery, with more than 50% of all product searches starting on Amazon, not Google. With an audience that loyal and demanding, it became evident that product discovery was never going to stay the ‘Google way’ moving forward.
The company made headlines again a few months back when it announced that it had a warehouse within 20 miles of a whopping 44% of the US population. That number, the article noted, had grown from 28% in 2014 to 36% in 2015, and now 44%.
The hue and cry across the interwebs about its zombie-warehouse treatment of employees hardly took away from the company blazing ahead with highly optimized, automated workflows at its fulfillment centers. In June this year, Bloomberg writers Kim Bhasin and Patrick Clark wrote, “In 2012 Jeff Bezos scooped up warehouse automation firm Kiva. Everyone else is still trying to catch up.” The company has steadily been scaling its Kiva robot workforce to almost 30,000 today.
This year, Amazon picked a winner at its Robotic Picking Challenge, scouting for robotic arms that are best at taking products off the shelf and putting them back. The pieces have been falling into place systematically for a couple of years. The robotic ‘store of the future’ now seems like a natural progression.
Through all of this, the company has shown immense focus on the consumer experience. It has shown tech and product prowess with Alexa, Dash and more. The Go story seems like a story of convenience, of time saved. With its entry into the home space, Amazon is certainly cementing itself as the shop of the new American household, extending convenience from the couch, all the way to the store.
“Where Walmart helped Americans save money, Amazon obsessively saves people time”
The in-store experience will not be without its challenges. Equipped with sensors everywhere (pressure sensors, RFID tags and more), automatic ID recognition at the entrance, and instant sensing of products taken off the shelf or put back, it looks like an activity flow designed around the same principles as the company’s warehouses, but the moving parts are aplenty. Most companies building AI struggle with scale: AI encountering new situations and objects can fail extensively. The end-user experience will determine the nature of these failures, the emotional price a user pays to interact with the store’s many technical moving parts, and the value derived from the trade-offs made. For a successful launch and a sustainable robotic store model, there is an urgent need for a dogged focus on making things simple, easy and instantly available to shoppers, at scale.
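To make the activity flow concrete, here is a toy sketch of how shelf events might roll up into a virtual cart that is charged on exit. This is purely illustrative: Amazon has not published Go’s actual architecture, and every name here (`StoreSession`, `on_shelf_event`, and so on) is a hypothetical stand-in for whatever fused sensor-and-vision pipeline the real store uses.

```python
from collections import defaultdict

class StoreSession:
    """Toy model of a checkout-free shopping flow (hypothetical;
    not Amazon's published design)."""

    def __init__(self):
        # shopper_id -> {product: quantity} virtual cart
        self.carts = defaultdict(lambda: defaultdict(int))

    def on_enter(self, shopper_id):
        # entrance gate identifies the shopper (e.g. an app scan)
        self.carts[shopper_id]  # creates an empty cart

    def on_shelf_event(self, shopper_id, product, taken):
        # fused signal from shelf sensors / computer vision:
        # taken=True means picked up, taken=False means put back
        if taken:
            self.carts[shopper_id][product] += 1
        elif self.carts[shopper_id][product] > 0:
            self.carts[shopper_id][product] -= 1

    def on_exit(self, shopper_id):
        # walking out finalizes the cart and triggers the charge
        cart = self.carts.pop(shopper_id, {})
        return {p: q for p, q in cart.items() if q > 0}

store = StoreSession()
store.on_enter("alice")
store.on_shelf_event("alice", "soda", taken=True)
store.on_shelf_event("alice", "chips", taken=True)
store.on_shelf_event("alice", "chips", taken=False)  # put back on shelf
final_cart = store.on_exit("alice")
print(final_cart)  # the put-back item drops out of the charge
```

Even in this toy version, the fragility the paragraph above describes is visible: one misread shelf event (a pick attributed to the wrong shopper, a put-back missed) corrupts the cart, which is exactly the kind of failure the end-user experience has to absorb gracefully.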
It’s for the same reason that technologies like computer vision and facial recognition, which have struggled to take off in mainstream industry for decades, will find a way to become part of a new habit. Let’s not kid ourselves: much of this store model will be about habit change for shoppers, and we’ll be happy to oblige despite some initial groaning. We might willingly walk into that store knowing our faces are being recognized, but do it because it’s Amazon, saying something like “they know pretty much everything about us anyway”! It is surprising how much privacy we’ve been known to trade away in recent years in return for convenience, a sense of community, or even a fear of missing out (FOMO). With computer vision and the world of sensors these stores will be fitted with, the online-to-offline personalization cycle will be complete.
We’ll swipe our phones as we walk in, like we do at train stations; snap pics of items on the shelves to find similar items online (ones unavailable in the store); try on clothes in front of interactive mirrors to check fit and shop for other stuff designed just for us; and walk through aisles laid out to make us navigate the store in particular ways.
Artificial intelligence, sensing and automation tech will certainly disrupt the retail industry, change the way we shop, create new consumer habits and possibly even get us back into the store. Maybe we’ll see premium stores with luxury brands and AI-led experiences; maybe we’ll see discount stores where tech lets people buy and sell seamlessly in the same space. The unfolding of Amazon’s robotic future is one to monitor closely. The change is coming at us faster than we can stop and think, making the here and now a great time to consciously design and direct it. The only reality that seems likely to stick is one whose possibilities are endless.