Walmart is about to use artificial intelligence in the worst way possible. According to a patent filing, the largest brick-and-mortar retailer in the world (likely looking for ways to compete with Amazon) is developing technology that can identify whether customers are unhappy or frustrated. It would likely use existing security and checkout cameras to read customers' faces.
As we all know, this is the year of machine learning and automation. Companies are adopting the “AI first” mantra, and it’s transforming industries as we speak. Of course, all of those wonderful innovations will save us time and could even save lives (a car under computer control can swerve out of the way faster than any young driver can). But they could also come at a high price — namely, eroding our privacy and nudging us toward a dystopian society.
Why do I think this will be an episode of Superstore soon?
Imagine how it will work. Glenn, the store manager on the show (played by Kids in the Hall alumnus Mark McKinney), is a little unhinged already. One day, a package arrives in his office from the corporate overlords — it’s a new webcam to install at the checkout lane. He’s pretty excited. The camera sends him a text every time the facial recognition software detects a customer who looks sad. How could that go wrong?
In real life, I could see Walmart employees appearing out of nowhere every time a teenager gets a text from his girlfriend or a dad running on fumes with four kids in tow has to buy diapers. It’s invasive, annoying, prone to errors, not that helpful, and a bit too much like Big Brother with a new toy. Facial recognition is a great idea when it comes to logging into a laptop or passing through airport security a bit faster; it’s annoying when it replaces actual human empathy.
And yet — it’s also inevitable. When we can use facial recognition to assist in the sales process, we will, even if it might seem heavy-handed or creepy. As one writer pointed out, the technology is one way to combat customer churn: one bad experience at Walmart means a customer might spend the rest of their days shopping at Costco instead. Also, this is a dogfight. Retailers are in a slump, so there’s no question they will look for ways to make sure customers are happy and never frustrated. I could see Walmart eventually using other automations to tell whether you spent only a short time in the store or browsed only for trinkets.
This is not something Amazon has to worry about online. All of the automations — like showing me books that match my interests — seem innocuous or even helpful. I’m being “scanned” just as much online and fed customized information. Yet in person, it seems like someone is watching me and pretending to know my intentions. That’s just not right.
The worst part about scanning for “unhappy” customers is that AI is really terrible at reading human emotions. Honestly, humans are really bad at reading human emotions. How do we create an algorithm that knows the difference between a sleepy dad and a depressed dad? What level of software automation is required to distinguish a teen who just played Call of Duty for 10 hours straight from a teen who just went through a breakup? I’m guessing this kind of AI is at least 20 years away, if not more — maybe 50. Before Walmart starts scanning us, it has two big hurdles. First, it needs to figure out how to scan that box of cereal correctly. Then, it needs to figure out the myriad human emotions on display.
Sadly, I don’t think any of that will stop Glenn from asking if we’re sad.