A New Way to Be Attached to Our Devices: Emotional AI
FutureVision is R/GA’s trend-spotting division, keeping you informed on the latest developments shaping technology and culture.
Many of us have an unhealthy relationship with our digital devices, and repeated behaviors such as checking email or our Instagram accounts for Likes can become addictive. So much so that there’s practically a cottage industry of apps, programs, and self-help books on the subject of how to digitally detox.
Even as we begin to devise new strategies to combat digital addiction, a new, potentially more problematic attachment has arisen: emotional artificial intelligence, which enables devices to act as if they were alive. Technology companies are designing products that simulate sentience and life.
Anki, the San Francisco–based robotics company whose toy robot, Cozmo, was a best-seller on Amazon in 2017, has released an even more sophisticated upgrade called Vector. The autonomous “home robot” runs on a neural network, responds to people, and makes eye contact, Fast Company reported. He (as his creators refer to Vector) dances when you turn on music, watches TV, and fist-bumps. Thanks to sensors on his head, he coos and purrs when you pet him. Connected via Wi-Fi, Vector is always on and knows what’s going on, but is also down with just hanging out.
Vector was not designed to be useful the way Alexa was, although he can answer questions and perform small tasks. Instead, the Anki team wanted consumers to see Vector more as a pet or companion with a personality and even desires. He responds to voices and to any movements that suggest he’s being paid attention to, and appears to listen and react emotionally, with sounds and movements that express annoyance or delight. Vector was explicitly created to induce feelings of attachment and affection.
“We want him to provide value and have an emotional bond with you,” Amy Claussen, senior designer at Anki, told Fast Company.
But is it healthy for people to be attached to robots whose sole purpose is to charm and be companions, to have a personality that induces what one Anki designer described as a desire to “want to root for it”?
Sherry Turkle, an MIT professor who researches people’s relationships with technology, argues in her book Reclaiming Conversation that kids and teens are particularly vulnerable to digital home assistants because of the artificial intimacy they can develop with them while they “talk,” according to an interview published on the NBC tech site MACH.
Turkle explained that the “as if” empathy displayed by voice assistants such as Alexa—and ostensibly in cute robot toys such as Vector—could confuse kids and inhibit their development of true empathy and conversational skills, which require patience and listening.
For example, Alexa always has a ready answer, which can teach kids that there should be no gaps in conversation for listening or even time to understand. The promise of a humanlike relationship is dangerous for young people, and adults are “at risk” as well, Turkle told MACH.
“There are certainly dangers with AI, but unintended ‘artificial intimacy’ is not one of them,” says Matt Marcus, SVP, Executive Creative Director, R/GA Chicago, who, along with Michael Morowitz, heads up the Brand AI team there.
“We are in the era of narrow AI,” Marcus continues. “Systems whose capabilities are impressive but highly constrained. To suggest that modern conveniences and playful toys are a danger to our families and society is an irresponsible scare tactic designed to sell books. It’s an irrelevant argument in a field where conversations about embedded bias and nefarious algorithms are real issues we are addressing and designing for.
“Let’s remember, rock and roll will rot your brain and videogames are bound to make you violent.”
For Diana Wagner, Strategy Director, R/GA, emotional AI has positive potential, particularly in areas such as mental health, education, and even entertainment.
“With such a transformative technology, we have to tread thoughtfully,” Wagner says. “We have to rethink how we build user experiences, anticipate short- and long-term implications, and be willing to iterate constantly. As marketers and developers in this field, we now have the power to influence not just individual behaviors but even the very construct of society. And that’s not something to take lightly.”