Management/Strategy Consultant | Hackernoon’s “AI Writer of the Year” | Editor of ThePourquoiPas.com
Stores are changing. We see it happening before our eyes, even if we don’t always realize it. Little by little, they are becoming just one extra step in an increasingly complex customer journey. Thanks to digitalisation and retail automation, the store is no longer an end in itself, but a means of serving the needs of the brand at large. The quality of the experience, a feeling of belonging and recognition, the comfort of the purchase… all these parameters now matter as much as sales per square meter, and must therefore submit themselves to the optimizations prescribed by Data Science and its “intelligent algorithms” (aka artificial intelligence in the form of machine learning and deep learning).
The use of artificial intelligence is, above all, a competitive necessity. Indeed, e-commerce players did not wait on anyone: note, for example, the adaptation of online search results to the end customer, or the recommendations made based on a digital profile. These two aspects are impossible for brick-and-mortar (for now). However, physical commerce has its own strengths. Olfactory, visual, auditory and other sensory data can be used to give the consumer the feeling of having experienced something unique, made specifically for them. In addition to customer relationship improvements, artificial intelligence also makes it possible to tackle problems that have long represented a burden for retailers: better inventory management, optimization of store space, optimization of employee time…
We present below a complete look at deep learning/ machine learning use cases implemented to create the store of the future, supported by real-life examples.
It’s a known fact that e-commerce actors can optimize their websites in real time using dynamic statistics. This allows them to define the most effective strategies according to the resources available and predefined customer segments. Like any physical space, the store does not have this luxury.
However, this does not prevent the periodic optimization of physical spaces, thanks to insights gleaned from intelligent algorithms. Back in the day (less than 20 years ago), retailers would hire students to follow and count customers in specific areas of the store. Thankfully, those times are over. Heat-maps, average route diagrams, time spent in front of screens, various ratios relative to total attendance, correlations… in-store cameras and computer vision algorithms now provide actionable tools based on images. Today, heat-mapping and activity-recognition solutions help not only to position promotions, but also to create entire marketing strategies, and to measure the performance of each department, as well as that of individual product placements. Solutions offered by the likes of RetailFlux can analyze store videos to give retailers data on the number of people in their store, the path they take once inside, and where they linger. This helps marketers identify popular locations, allowing them to change the layout of furnishings, displays, advertising or staff to better serve their customers and increase revenue.
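To make the idea concrete, here is a minimal sketch (in Python) of how dwell positions extracted from camera footage can be aggregated into a coarse heat-map grid. The `build_heatmap` helper and the sample coordinates are entirely hypothetical illustrations, not any vendor’s actual API.

```python
from collections import Counter

def build_heatmap(positions, cell_size=1.0):
    """Aggregate tracked (x, y) floor positions into a grid of visit counts."""
    counts = Counter()
    for x, y in positions:
        # Bucket each position into a square cell of side `cell_size` (metres).
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] += 1
    return counts

# Hypothetical tracker output: one (x, y) point per person per frame, in metres.
tracked = [(0.4, 0.2), (0.6, 0.3), (2.1, 0.1), (2.4, 0.4), (2.2, 0.2)]
heat = build_heatmap(tracked, cell_size=1.0)
# The busiest cell tells marketers where shoppers linger most.
hottest_cell, visits = heat.most_common(1)[0]
```

A real system would of course feed this from a person-detection model running on the video stream, and normalise counts by total attendance, but the aggregation step itself is this simple.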
As technologies evolve, we are also starting to hear about “demographic recognition”: these tools, created by start-ups such as DeepVision AI, MyStore-e, RetailDeep and RetailNext, allow us to estimate the age and gender of people passing in front of a camera, giving stores access to a whole new granularity of analyses. This aspect is paramount to the rationalisation now expected of marketers and category managers.
Although these cameras are often hung from the ceiling, this is not always the case: Walgreens (in partnership with Cooler Screens), for example, recently integrated cameras, sensors and digital screens in the doors of its stores’ coolers to create a network of “smart” displays that brands can use to target ads to specific types of customers. The doors act as a digital merchandising platform that depicts food and beverages in their best light, but also as an in-store billboard that can show ads to consumers who are approaching, based on variables such as approximate age, gender and current weather. Cameras and sensors inside the connected coolers can also determine which items buyers have picked up or viewed, giving advertisers insight into how their promotions work on the screen, and quickly notifying a retailer if a product is no longer in stock.
The key question thus shifts from “where” and “how many” to “who”, “when”, “how often”, “how long” and “for how many cookies?”.
These data, mixed with those from check-outs and loyalty programs, are key to forecasting demand and creating store clusters, which in turn improves retailers’ supply chains. By better predicting what products will do well in a certain area, machine learning algorithms from startups such as Symphony RetailAI can reduce dead stock, help optimise pricing (and profits), and increase customer loyalty (people obviously tend to enjoy finding the right product mix in their nearest store).
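As a toy illustration of the forecasting idea (not Symphony RetailAI’s actual method), a single-SKU demand forecast can be as simple as exponential smoothing over weekly sales. All numbers below are made up, as is the on-hand figure.

```python
def forecast_demand(weekly_sales, alpha=0.3):
    """Single exponential smoothing: next-week demand forecast for one SKU."""
    level = weekly_sales[0]
    for demand in weekly_sales[1:]:
        # Blend the newest observation with the running level.
        level = alpha * demand + (1 - alpha) * level
    return level

# Hypothetical sales history (units/week) for one SKU in one store.
history = [120, 130, 125, 140, 135]
next_week = forecast_demand(history)
# Order only what the forecast says is missing (40 units assumed on hand).
reorder_qty = max(0, round(next_week) - 40)
```

Production systems layer seasonality, promotions, weather and store-cluster effects on top of this, but the principle is the same: order against a forecast instead of a hunch, and dead stock shrinks.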
Indeed, unsold stock might be one of the retail industry’s biggest handicaps: unused inventory costs U.S. retailers about $50 billion a year. Reducing this number is key to the industry’s long-term survival: every dollar spent on what becomes dead inventory is valuable money that could have been put towards training talent, better R&D, or, most obviously, brand new smart algorithms.
Forecasting also helps retailers optimise their promotions: the less dead stock sits in the warehouse, the more strategic promotions can be, rather than merely reactive. Many pricing aficionados will particularly appreciate this aspect, as it will make their job a lot easier, and a lot less thankless.
In the same way that a website can adapt in real time to end users, an increased granularity of computer vision is also possible in stores, allowing it to target individuals. However, these algorithms rely on more elements than the ones presented above, and are thus more complex and less reliable. To work at a personal level, the algorithms need a mix of demographic recognition, loyalty-code identification and augmented reality, often integrated into smart objects such as mirrors.
Although they cannot (yet) be implemented on a large scale, these solutions exemplify a profound change in the way stores sell. We are moving from the sale of products to the sale of experiences, where the physical offer becomes a by-product. This is the concept of shoppertainment. Low prices and an extensive catalog are no longer enough for customers, who can find such a value proposition online. An authentic brand experience becomes key to survival: the store is a storehouse of engaging experiences, ideas and interactions.
The use cases are of course numerous (even if they often border on the sci-fi technobabble side of the AI equator): during 2019’s NRF, Google presented a connected mirror which links visual recognition data with the store’s product database. In the case of an optical store, for example, the mirror can recognize the model being tried on and display product or marketing information about it. Sellers also get real-time statistics on the use of the mirror: they know when a person has been trying a certain pair of glasses for a while, or is hesitating between two pairs. This facilitates the work of the seller, who can then advise the customer on the products that really interest them.
H&M has for its part allied itself with Microsoft to test a mirror that lets customers take selfies via voice commands, while Lululemon’s mirror acts more like a board that encourages its customers to engage with the community created and maintained by the brand.
Smart mirrors can of course be placed at different points of the purchasing process: Ralph Lauren’s is located in the fitting room, to transform the often frustrating experience of trying on clothes. Buyers can interact with the mirror to change the lighting in their fitting room and can select different sizes or colors for their outfits, which an employee will then bring over. The mirror also recommends other items that would go well with what is being tried on.
Cosmetic companies have also adopted these solutions: the Sephora smart mirror uses an intelligent algorithm which mixes the gender, age, appearance and style of the person looking at it in order to make recommendations. It even claims to differentiate between people wearing neutral or bright colors, daring or conservative styles and clothes with floral and geometric patterns to name a few.
Through deep learning, we are also seeing a new technique emerge: affective computing. It is the ability of computers to recognize, interpret, and possibly simulate emotions. It is indeed possible to identify gestures such as head and body movements, while the tone of a voice can also speak volumes about an individual’s emotional state. These insights can be used in store to avoid bothering a customer who clearly does not want to be helped. These technologies are nevertheless new (only Releyeble offers retail use cases) and intrusive: it is therefore preferable not to comment on future use cases just yet.
Mirrors, augmented reality, virtual reality… these rarely respond to real pain points for retailers and their customers. And we know those pain points by heart: checkout wait times, quick product localization and inventory management. Those should be priorities for stores looking for ways to use machine learning and deep learning solutions.
In China, for example, customers of certain KFCs can, thanks to Alipay technology, make a purchase simply by standing in front of a POS equipped with cameras, after having linked an image of their face to a digital payment system or bank account. The American chain Caliburger has also tested facial recognition in some of its restaurants: the first time customers order using in-store kiosks, they are invited to link their faces to their account using NEC’s NeoFace facial recognition software in order to benefit from numerous advantages. Payment by bank card is still necessary, but the company intends to switch to payment by facial recognition if the initial test phase is successful.
Fears over cybersecurity could however prevent this kind of solution from seeing the light of day on a large scale. Indeed, customers are increasingly protective of their personal data (and rightly so): according to a Wavestone study, only 11% of consumers are ready to submit to facial recognition in stores. For recognition by mobile application, this figure rises to 40%.
Other, more viable ways to use computer vision to make checkout more fluid are therefore being considered. We are by now all familiar with Amazon Go’s automated stores (not too familiar, one hopes), which allow customers with a Prime account to enter the store with a code on their phones, do their shopping, and exit without going through a checkout. Because an algorithm has “followed” the customer around the store, the total amount of their purchases is automatically debited, and an invoice is sent by email. Testing of this technology is also underway at Casino, in partnership with XXII.
There are many start-ups in this space: Standard Cognition, Zippin, Trigo Vision… all claim to help companies eliminate checkout for their customers. China, meanwhile, is casually reworking the very concept of the store through the Bingo-Box by Auchan.
All these cameras can be used to see more than customers: many solutions for monitoring shelves have indeed emerged. These send an alert to employees in the event of a shortage, allowing a prompt response.
This is key for stores: stock-outs represent more than $129 billion in lost sales in North America each year (~4% of revenues). Not only that, but stock-outs can also actively drive customers into the arms of the competition: 24% of Amazon’s revenue comes from customers who have experienced a stock-out at a local retailer. There are many examples of such solutions: in France, Angus AI works with Les Mousquetaires. In the US, Walmart has been working on this concept since last year, as has AB InBev with Focal Systems. Interestingly, Yoobic’s solution offers a similar process, but the camera is in the hands of individuals, who take the photos that are then analyzed by the algorithms. In China, meanwhile, Hema (Alibaba’s store of the future) is pushing the boundaries of the augmented store further than anywhere else in the world.
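The alerting logic that sits downstream of such a shelf-scanning vision model can be sketched very simply. The SKU names, counts and thresholds below are hypothetical, as is the `shelf_alerts` helper; the interesting (and hard) part in real products is the detection model producing the counts, not this step.

```python
def shelf_alerts(detected_counts, min_facings):
    """Compare camera-detected facings per SKU against restock thresholds."""
    alerts = []
    for sku, count in detected_counts.items():
        threshold = min_facings.get(sku, 1)
        if count == 0:
            alerts.append((sku, "OUT OF STOCK"))
        elif count < threshold:
            alerts.append((sku, "LOW - restock"))
    return alerts

# Hypothetical output of a shelf-scanning vision model: SKU -> visible facings.
detected = {"cola-330": 0, "chips-salt": 2, "water-1l": 9}
alerts = shelf_alerts(
    detected,
    min_facings={"cola-330": 4, "chips-salt": 4, "water-1l": 4},
)
```

In practice each alert would be pushed to a store employee’s handheld device, turning a daily manual shelf walk into a continuous, exception-driven process.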
Of course, images aren’t the only things that can be analyzed in store; voice also has a role to play in streamlining customer journeys. This under-appreciated method of shopping is due for a small revolution: 13% of all households in the United States owned a smart speaker in 2017, per OC&C Strategy Consultants. That number is predicted to rise to 55% by 2022.
The fact that Amazon is also a leader in voice technologies shows how serious the Seattle giant is in terms of its brick-and-mortar domination (having already conquered virtual spaces). The brand’s Echo Buds, launched in 2019, work with Alexa to answer any questions it understands while a customer is on the move. More interestingly for retail, they also inform the user if the closest Whole Foods (which Amazon owns) has an item the customer is looking for. Once informed and in the store, the Echo Buds can direct them to the right aisle. You can imagine Alexa not only guiding you to an item: if you tell it you want to make lasagna, it could also guide you through the store along the quickest route to pick up all the necessary ingredients. The future is ear (get it?).
Virtual assistants are indeed on the rise. The Mars Agency, for example, has partnered with American retailer BevMo! to test SmartAisle, a digital whiskey purchase assistant. By mixing artificial intelligence, voice-activated technology and LED lights on the shelves, SmartAisle helps buyers choose the perfect whiskey bottle. Three bottles are recommended after a quick conversation, and the relevant shelves light up to lead the customer to the preferred bottles. If customers already have a brand in mind, the assistant can recommend other brands or bottles with similar flavor profiles. The whole experience lasts no more than 2 minutes. The voice assistant makes it a pleasant and informative experience, with a mix of banter and useful information.
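A flavor-profile recommendation of this kind can be sketched with plain cosine similarity over hand-scored attributes. The bottle names, flavor axes and scores below are invented for illustration and have nothing to do with SmartAisle’s real data or method.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical flavor vectors: (smoke, sweetness, spice), each scored 0-10.
profiles = {
    "Islay Single Malt": (9, 2, 5),
    "Speyside Single Malt": (1, 8, 3),
    "Bourbon": (2, 7, 4),
}

def similar_bottles(liked, profiles, top=2):
    """Rank the other bottles by flavor similarity to the one the shopper likes."""
    target = profiles[liked]
    others = [(name, cosine(target, vec))
              for name, vec in profiles.items() if name != liked]
    return sorted(others, key=lambda p: p[1], reverse=True)[:top]

recs = similar_bottles("Speyside Single Malt", profiles)
```

The shelf-lighting part is then just a lookup from the recommended bottle to its shelf position; the conversational layer on top decides which attributes to ask about.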
From NLP to virtual assistants, the two examples above show that, if used well, voice technology can free up employee time and give key data to retailers.
The discussion on improving and streamlining processes would not be complete without a look at robotics. These objects, long relegated to science fiction, are now showing their usefulness in stores around the world. Although robotics is not in itself a subcategory of artificial intelligence, robots roaming the aisles rely on notions of computer vision and NLP. Just like Amazon, Walmart is here too at the cutting edge of technology: Bossa Nova robots (called “Auto-S”), designed to scan items on the shelves to help with price accuracy and restocking, are already present in 1,000 of its stores. These six-foot-tall devices each contain 15 cameras, which scan shelves and send alerts to employees in real time. This frees workers from repeatable, predictable and manual tasks, giving them time to focus more on sales and customer service.
Walmart has also introduced robots that clean floors, unload and sort items from trucks, and pick up orders in stores. It is interesting to note that this niche is quickly becoming highly competitive: Simbe’s robots have been deployed in Schnucks stores across America with the same value proposition as Bossa Nova, while Lowe’s unveiled, back in 2016, a robot that can understand and respond to simple customer questions. Post-coronavirus, it is almost certain that the movement towards robotics will accelerate in the coming months.
“Shrinkage” (theft) has an enormous cost: €49 billion per year on a European scale (2.1% of annual turnover in the distribution sector), weighing heavily on the margins of distributors already under intense pressure from price wars. Security therefore becomes a pressing need, and because of costs, so does automation. This can take many forms. Augmented cameras, for example, can identify if a product has been hidden, and alert a human. This would, however, produce a lot of false positives, given the physical impossibility of an all-knowing camera. Companies such as Vaak or DeepCam AI claim to avoid this problem by alerting someone only if the behavior of a visitor is highly suspicious. Solutions such as StopLift also offer to detect “sweethearting” (an employee pretending to make a transaction, but in fact giving a product to an acquaintance without payment). It is important to remember that a large percentage of store thefts involve employees. The ROI of these solutions is easy to calculate: stores know exactly how much they lose to theft and errors. As such, this use case is likely to be one of the first to be implemented.
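That back-of-envelope ROI calculation looks roughly like this. Every number below is assumed purely for illustration, not taken from any vendor’s figures.

```python
def theft_roi(annual_shrinkage, prevented_fraction, annual_solution_cost):
    """Back-of-envelope ROI for an anti-shrinkage vision system."""
    savings = annual_shrinkage * prevented_fraction
    return (savings - annual_solution_cost) / annual_solution_cost

# Hypothetical numbers: a store losing 200k EUR/year to shrinkage, a system
# that prevents 30% of those losses and costs 20k EUR/year to run.
roi = theft_roi(200_000, 0.30, 20_000)  # -> 2.0: each euro spent returns two in net savings
```

Because shrinkage is already measured precisely in every retailer’s accounts, this is one of the few AI investments whose business case can be written on a napkin, which is exactly why it tends to get funded first.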
In view of all these developments, and despite their many positives for both retailers and customers, it is essential that customers question retailers about who has access to data and how it is used. It goes without saying that transparency must be the watchword of any use of personal data in order to guarantee consumers the preservation of their private life.
If you’re eager to get going with your very own corporate A.I. project, I recommend jumping straight to my latest article on the matter: 10 Steps to your very own Corporate Artificial Intelligence project.