Artificial Intelligence, or “AI”, has started appearing in our day-to-day lives. We’re chatting with computers; personal shoppers live inside our phones and speakers. Image-based AI capabilities are slowly finding application in the real world.
Industries are being swallowed by AI; Gartner even estimates that chatbots will replace human customer service reps in 85% of customer interactions by 2020. That’s next year. At this speed, AI could transform the retail landscape at every stage of the cycle, from manufacturing to sales. Not only does it have the power to drive down costs for businesses, but it could change what it means to shop.
But what IS artificial intelligence?
This is an important question, because “AI” is famously hard to define. For that reason, newspapers written for non-technical audiences sometimes describe products that aren’t really AI as artificial intelligence.
Here’s a more technical definition: to qualify as AI, a software application has to use at least one “machine learning” technique. At its core, AI begins with data. You teach the AI with examples – for an image-based bot, you might feed it ten thousand pictures, half containing a giraffe, each tagged “giraffe” or “not a giraffe”. Having learned from those examples, it can then identify a giraffe in a picture it has never seen. This is much easier than trying to manually code what a giraffe “looks like” to a machine – something nobody has ever managed successfully.
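The “learn from labelled examples” idea can be sketched in a few lines of code. This toy nearest-centroid classifier is purely illustrative – real image models learn from millions of pixels, whereas here each “picture” is just two made-up feature scores (think neck-length and spot-density):

```python
# A toy illustration of learning from labelled examples.
# Each "picture" is two invented feature scores, not real pixels.

def train(examples):
    """Average the feature vectors for each label (nearest-centroid)."""
    sums, counts = {}, {}
    for features, label in examples:
        s = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            s[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in s] for label, s in sums.items()}

def predict(centroids, features):
    """Label a new 'picture' by whichever centroid it sits closest to."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], features))

training_data = [
    ([0.9, 0.8], "giraffe"), ([0.8, 0.9], "giraffe"),
    ([0.1, 0.2], "not a giraffe"), ([0.2, 0.1], "not a giraffe"),
]
model = train(training_data)
print(predict(model, [0.85, 0.75]))  # tall and spotty -> "giraffe"
```

Note that nobody wrote a rule saying what a giraffe looks like – the labels do all the work.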
AI of this kind is a recent invention. For example, the bot “Deep Blue”, which beat world champion Garry Kasparov at chess in 1997, wouldn’t qualify as AI under this definition because it used “brute force” – it searched enormous numbers of possible continuations and picked the best move after each of Kasparov’s. “AlphaGo” by DeepMind, the bot which beat world Go champion Lee Sedol in 2016, would qualify, because it used a different technique. Go is too complex for even a powerful computer to search every possible move and outcome. Instead, AlphaGo played itself millions of times, spotted patterns in the data, and used those patterns to outplay the world champion. Rules in a closed system in 1997; patterns in data in 2016. That’s nineteen years of progress. And it makes all the difference – because the world is too complex to ever model completely, so we rely on patterns all the time.
This also changes the way we can interact with computers. AI bots understand what’s called “unstructured” data.
Previously, machines could only digest “structured” data – data whose meaning is clearly tagged. For example, software might present you with a menu, and you can click on a link. That click is a datum with a pre-coded meaning to the machine, like “open Microsoft Word” or “turn off”.
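The difference is easy to see in code. In a traditional menu-driven program, every input the machine accepts has a meaning the programmer wrote down in advance – anything outside that lookup table simply can’t be understood (the command names here are invented for illustration):

```python
# Structured data: every valid input has a pre-coded meaning.
# A menu click is just a key in a table the programmer wrote by hand.
MENU_ACTIONS = {
    "open_word": lambda: "Opening Microsoft Word...",
    "shut_down": lambda: "Turning off...",
}

def handle_click(click_id):
    """Look up the pre-coded meaning of a click; fail on anything else."""
    action = MENU_ACTIONS.get(click_id)
    return action() if action else "Unknown command"

print(handle_click("open_word"))      # Opening Microsoft Word...
print(handle_click("hello computer"))  # Unknown command: no pre-coded meaning
```

An AI bot handling unstructured data has no such table – it has to infer the meaning from patterns in its training data.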
Unstructured data, by contrast, can be presented to a bot in any form; so long as the bot has been trained on similar data, it should be able to infer what to do with it. Unstructured data types include video footage, images, voice, and language – so if a tool is manipulating images, like FaceApp, or interpreting voice, like Alexa, there’s a good chance it’s AI.
Alternatively, some people take “AI” to mean computer intelligence which mimics humans in some substantial way. That can also be true – but remember, the most interesting “intelligence” machines can produce doesn’t have to resemble human intelligence at all.
So how is AI being used in retail and hospitality?
1. Tailored recommendations online
In need of the perfect winter coat but don’t know where to look? In 2016, outdoor clothing brand The North Face partnered with IBM’s Watson to create an app that acts as your Expert Personal Shopper (XPS). The tool sifts through over 350 jacket options on the site, then makes a choice based on your specific needs. Watson, IBM’s AI supercomputer, powers the assistant, which asks questions to pin down exactly the right gear for you.
When and where will you be using the jacket? All the Q&A data collected will also feed other marketing efforts, like personalized emails. But will this smart algorithm ever be as convincing as a human personal stylist? Better stick to game shows, Watson.
2. Virtual changing rooms
Gone are the days when you’re battling frustrated shoppers for the last free fitting room. With virtual mannequins, customers can see how they look in a variety of styles, and might not need to visit a shop’s changing room at all.
In store, virtual changing rooms will cut costs for retailers, skirting the theft risks and high staffing costs of real fitting rooms. On the other hand, adding “depth” to the online buying experience might prompt more shoppers to stay home and order online.
Augmented reality dressing rooms work by superimposing a 3D model or picture of the clothing onto a live video feed of the customer. They are generally underpinned by facial recognition, which in turn relies on machine learning algorithms. Amazon is developing an app which will build your mannequin from social media pics, mining data stored on phones and computers to generate an AR image of the customer wearing the stock. It would also use data like calendars (showing the person’s appointments and job requirements) to suggest outfits for upcoming events. Which seems like a more intimate relationship than some people would want with their bot!
I doubt that dress shopping via webcam will match the fun of IRL anytime soon.
3. Robots helping customers
Stores can use chatbots to provide customer support, both in store and online. Created in 2014 by the Japanese company SoftBank, the robot Pepper – a humanoid who can chat to customers – is now used by 2,000 companies globally. When Pepper was trialed in 2016 at B8ta, a tech retail shop in Palo Alto, SoftBank claimed the store saw a 70% increase in foot traffic.
But are punters only flocking to the store for the novelty? As cute as Pepper sounds, at least for now, there’s little evidence to suggest that the robot is making more money than he costs. Googly-eyed Marty, another autonomous bot, can already be seen roaming aisles in the US. Marty detects store spillages but doesn’t clean them up. More like a burden than a benefit, I’d say.
It’s also worth saying that these are AI in the older tradition – intelligent machines which act like humans – rather than pattern-spotting, data-crunching software. But robots could creep into more and more shops soon.
4. Voice-powered grocery shopping
“As a busy working Mum, this makes life easier.” That’s one user’s verdict on the Ocado skill, an app for Amazon’s Alexa that manages online shopping lists via voice command. Reviews like this reveal just how much AI is changing how we shop.
The Ocado skill tells customers which products are in season and makes suggestions based on their shopping history. But you can’t create new orders with it yet – it’s only for adding products to an existing basket. Other reviewers point out further limits: some wish it were “snappier and less chatty”, and one complains that it “needs some work, gets stuck in loop talking to itself!”
This voice interface hasn’t overhauled shopping habits so far. Morrisons reported that only 1% of its shoppers are now using its Alexa-enabled voice-ordering service, which launched back in 2017.
5. Smoother logistics with algorithms
When bad weather or slippery roads threaten delays, AI systems can reroute shipments. For example, a new UPS machine learning tool – Network Planning Tools – can detect and clear bottlenecks based on cost-benefit analysis.
If the tool sees a storm on the way, it can reroute packages away from trouble. That’s especially useful for UPS, which operates by road, rail and air. By automating these processes with advanced algorithms, companies can make sure Christmas presents arrive on time.
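UPS hasn’t published how Network Planning Tools works under the hood, but the core rerouting idea can be sketched with a classic shortest-path algorithm – here Dijkstra’s, run over an invented hub network where a “storm” simply blocks one leg:

```python
import heapq

def cheapest_route(graph, start, goal, blocked=frozenset()):
    """Dijkstra's algorithm; edges listed in `blocked` (e.g. under a storm) are skipped."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, weight in graph.get(node, {}).items():
            if (node, nxt) not in blocked:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return None  # no route available

# Hypothetical hub network; edge costs could mix fuel, time, and delay risk.
network = {
    "Louisville": {"Chicago": 3, "Dallas": 4},
    "Chicago": {"Seattle": 5},
    "Dallas": {"Seattle": 7},
}
print(cheapest_route(network, "Louisville", "Seattle"))
# (8, ['Louisville', 'Chicago', 'Seattle'])

# Storm over the Louisville-Chicago leg: reroute via Dallas instead.
print(cheapest_route(network, "Louisville", "Seattle",
                     blocked={("Louisville", "Chicago")}))
# (11, ['Louisville', 'Dallas', 'Seattle'])
```

The machine learning part would come in estimating those edge costs – predicting delay risk from weather and traffic data – rather than in the routing itself.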
6. Facial recognition to reduce shoplifting
With the damage done by retail crime ever rising, AI can help businesses spot the bad guys. Facewatch, a facial recognition system that watches people walking into stores, is currently being rolled out at a Budgens in Aylesbury, England. Cameras scan customers’ faces against a record of people previously caught on CCTV shoplifting or abusing staff. If there’s a match, the retailer gets a text.
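Facewatch doesn’t disclose its matching algorithm, but systems of this kind typically reduce each face to a numeric “embedding” and compare it against a watchlist by similarity. A minimal sketch, with made-up three-number embeddings and an invented threshold:

```python
import math

def cosine_similarity(a, b):
    """How alike two face embeddings are, from -1 (opposite) to 1 (identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_watchlist(face_embedding, watchlist, threshold=0.9):
    """Return the watchlist ID of the best match above the threshold, else None."""
    best_id, best_score = None, threshold
    for person_id, stored in watchlist.items():
        score = cosine_similarity(face_embedding, stored)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy watchlist: real embeddings would have hundreds of dimensions.
watchlist = {"subject-17": [0.9, 0.1, 0.4]}
match = check_watchlist([0.88, 0.12, 0.41], watchlist)
if match:
    print(f"Alert: possible match with {match}")  # -> text the retailer
```

The threshold is where the civil liberties questions bite: set it too low and innocent shoppers get flagged, which is exactly the false-match risk groups like Liberty worry about.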
But the tech is riddled with problems. Civil liberties groups like Liberty have highlighted threats to privacy. When it comes to facial data, the question of who gets hold of it, and for what purpose, is far more sensitive. People are starting to ask: is this justified, or even legal? This month, the King’s Cross developers revealed that they had been using controversial facial recognition CCTV in the area for security reasons. They’re now working with the ICO to make sure any future face-matching systems are legal. Even if the cameras haven’t been used since March 2018, it’s still pretty dodgy that this was kept secret.
AI is changing the retail game, especially as we move away from brick-and-mortar and towards mobile shopping, where retailers “know” ever more about our choices. With Amazon now a major player in the grocery market, having just bought Whole Foods, retailers are being forced to rethink their strategies (online and IRL) to stay competitive.
Want to stay ahead of other stores? 💅
Get the newsletter. Once per month, for free