Thoughts on AI: Is AI coming to your emotional rescue?
The case for emotional AI in retail and beyond
With the recent closures of major retail stores around the United States, there’s likely a ghost mall near you.
According to real estate data firm CoStar, over 90 million square feet of retail space is slated to close this year, leading observers to point to an obvious truth: empathy matters in customer service. Getting it right is another story.
When businesses are out of touch with consumer needs, consumers stop buying and stores start dying.
Enter "affective computing," an area of research involving machines that can read and display emotional intelligence, with applications ranging from preventive medicine to music lessons and every commercial sector in between.
The retail industry isn't the only one eyeing “emotion AI” as a potential savior from digital disruption, but the physical spaces that characterize the retail experience are providing innovators with a ripe venue to demonstrate the power that capturing and understanding customer sentiment can have.
That’s especially true when paired with physical devices connected through the Internet of Things (IoT).
MIT professor Dr. Rosalind Wright Picard first articulated the idea of machines that could better interact with humans in her 1997 book, "Affective Computing," which detailed the potential impact of emotionally sensitive cognitive systems and launched a field of research that has grown alongside the development of artificial intelligence and cognitive systems.
According to Dr. Picard, if we want computer systems to interact intelligently and authentically with us, they need a discernible emotional IQ: the ability to recognize emotions, understand their nuances and even emulate emotions themselves.
The emotion AI market is expected to grow from $12.20 billion just two years ago to a whopping $53.98 billion by 2021, at a compound annual growth rate (CAGR) of 34.7 percent, according to research group MarketsandMarkets.
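Those figures are internally consistent. A quick arithmetic check (assuming the five-year 2016–2021 window implied by the article's timing, which is my inference, not stated in the report):

```python
# Sanity check of the quoted market figures over an assumed
# five-year window (2016-2021).
start, end, years = 12.20, 53.98, 5   # $B, $B, years

# Forward: compound the starting figure at the quoted 34.7% CAGR.
projected = start * 1.347 ** years
print(f"${projected:.2f}B projected")       # lands close to the quoted $53.98B

# Backward: derive the CAGR implied by the start and end figures.
implied_cagr = (end / start) ** (1 / years) - 1
print(f"{implied_cagr:.1%} implied CAGR")   # close to the quoted 34.7%
```

Both directions round to the published numbers, so the projection hangs together arithmetically.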
It’s no wonder enterprises and startups are embracing machine learning and AI. According to a 2018 survey of leading business and technology executives from a variety of industries, AI is already having a major impact in just about every role and function, with 61 percent of respondents reporting they are implementing AI and have a budget to match.
And why stop with machine learning? The addition of computer vision in cameras, self-driving cars and other devices means artificial systems can now recognize faces, navigate their surroundings and read objects. These advances will help AI overcome the barriers that separate algorithms that merely recommend purchases from those that can attempt to build relationships with humans, thanks to their ability to home in on context and intention.
"In a future where we're going to be interacting with machines all the time, from the buildings that we sit in to the kiosks and touch screens we use when we place our order in restaurants, what Dr. Picard has picked up is a movement that allows us to bring emotional awareness into computing," said Jag Minhas, founder of Sensing Feeling. His London-based startup develops new IoT technology that helps companies improve their products and services by attempting to authentically gauge people’s reactions to a brand's retail experience.
Minhas points to the emotional connection most people have with consumer technology brands such as Apple dating back to the 1980s. His company's website details a timeline of a digital revolution starting from the Internet as a publishing platform in the early 1990s, trending towards a future “feeling” internet within a decade.
While it may be another decade before physical spaces can discreetly monitor and respond to human feelings, movements and gestures (ideally for consumer benefit), early signs of emotion AI can be found in the virtual personal assistants people use every day.
One can look to startups such as Soul Machines to get a glimpse of what's next. The New Zealand-based startup has set its efforts on putting a "human face" on AI with interactive artificial humans built using a combination of neural networks and brain models powered by IBM Watson and IBM Cloud.
In February, Soul Machines announced it had partnered with Daimler Financial Services to demo Sarah, a digital human designed to improve their customers’ experiences with car financing, leasing and insurance. Sarah features artificially generated empathetic facial gestures and a natural voice intonation that not only feel more humanlike but will eventually be paired with real-time recognition of nonverbal behavior using face recognition.
Could this development sound the death knell for the customer service chatbots customers have become accustomed to? Not necessarily, says Andrei Faji, director of marketing and customer experience for Austin, Texas-based WayBlazer, another company that cut its teeth using IBM Watson technology to personalize travel recommendations by analyzing customer intent and context. The answer to improving customer service isn't necessarily to slap a human face on a machine.
Faji argues more for contextual awareness over emotional awareness, mostly due to the difficulty of incorporating emotional awareness, not only within an algorithm, but in customer service at large. Contextual awareness can help algorithms tell the difference between a search for a romantic getaway for a 30th wedding anniversary or a weekend destination for a dating couple.
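Faji's anniversary-versus-dating-weekend example can be sketched in a few lines. This is a purely hypothetical illustration of contextual routing, not WayBlazer's actual API or model; the function name and context keys are invented for the example:

```python
# Hypothetical sketch: the same destination query is routed differently
# depending on lightweight contextual signals the traveler supplies.
def recommend(query: str, context: dict) -> str:
    """Choose a recommendation style from contextual signals."""
    occasion = context.get("occasion", "")
    if "anniversary" in occasion:
        # A milestone occasion suggests an upscale romantic trip.
        return "upscale romantic packages (fine dining, spa, suite upgrade)"
    if context.get("trip_length_days", 0) <= 3:
        # A short trip suggests a quick weekend getaway.
        return "weekend getaway deals (boutique hotels, nightlife, late checkout)"
    return "general leisure recommendations"

# Identical query, different context, different recommendations.
print(recommend("romantic getaway", {"occasion": "30th anniversary"}))
print(recommend("romantic getaway", {"trip_length_days": 2}))
```

In practice the signals would come from a learned model rather than hand-written rules, but the point stands: context, not simulated emotion, is what disambiguates the two trips.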
Faji cites the travel industry, which has invested heavily in chatbots to help customers find their lost bags, get refunds or figure out the late check-out policy.
“In any situation where a guest is in an emotional state, we’ve seen a decrease in usage of chatbots because they understand it won’t be an effective exchange and there’s going to be a lack of empathy there,” he says.
At the same time, notes Faji, research shows that chatbots that are overly personified elicit more negative reactions from users than chatbots that are a bit more utilitarian. People can easily pick up on the inauthenticity of a manufactured response, and thus judge such bots, reject them and sometimes even abuse them.
So what’s a brand eager to create the ultimate technology lapdog to do?
Faji offers this advice for companies looking to start applying emotional intelligence or contextual awareness: dial it up or down based on the users’ needs and the use case.
First, start with the brand’s identity. Understand who your customers are and what they want. For example, a bot or digital assistant designed for platinum card travelers in Russia shouldn’t look, act or sound the same as the bot built for an economy, no-frills airline.
“If it’s not designed to help the user make things faster, easier and more intuitive, then it’s a novelty and the user will probably not want to adopt a new paradigm,” he said.