Robots Are Learning to Fake Empathy
Military-funded research has developed artificial intelligence that can read and respond to human emotion.
SimSensei working with a subject. Image: ICT
Emotional intelligence is a cornerstone of human interactions—an essential part of what it means to be human. But now, artificial intelligences are being developed to better read and process human emotions, which is already changing the way we interact with robots.
In the early 1990s, psychologists Peter Salovey and John Mayer were the first to recognize emotional intelligence as a set of knowledge and skills distinct from other forms of intelligence, defining it as "the ability to monitor one's own and others' feelings and emotions, to discriminate among them, and to use this information to guide one's thinking and actions." Emotional intelligence is something that seems wonderfully and innately human.
But it turns out the tenets of emotional intelligence—which we start picking up in infancy and which seem so closely linked to human nature itself—can be quantified and reduced to logical procedures and algorithms. We humans are not so special, after all.
Developers at the University of Southern California Institute for Creative Technologies (ICT) have turned their attention to adding emotional intelligence to the AI they install in their virtual agents—animated, human-like interfaces that engage a user in conversation. The result is "empathic" virtual agents that can read, understand, and respond to human behavior.
I spoke to Albert "Skip" Rizzo, a psychologist and director of medical virtual reality at ICT, about what's going on at the Institute. He told me all about ICT's latest development in artificial emotional intelligence, a congenial counselor called Ellie. Ellie evolved from ICT's early work on virtual human agents, which started in 1999.
"A lot of our funding has come from the military in order to build virtual humans they can use for training purposes," Skip told me. "So for example, we developed applications that can train someone to negotiate with an Afghan warlord, or to understand cultural sensitivities and to interact with somebody from another country, particularly Iraq or Afghanistan."
Recognizing a connection between negotiation skills and therapy, the team at ICT decided to leverage that early technology to put virtual agents in a civilian clinical setting. "The idea was to create virtual patients, so that a training clinician could mess up with a bunch of virtual characters before they got their hands on a live one," Skip said.
"What we found was really surprising. In spite of the fact that the character didn't look that real—she was pretty rigid and so forth—when medical school psychiatry residents interviewed her, once they got a couple of good answers from her all of a sudden they became engaged in the process of interviewing just as if it was a real person," Skip explained.
"So we learned from that, that the appearance of a character is less important than the level of interaction. And therein is the kernel of the whole thing about AI."
That idea is at the heart of the ICT's DARPA-funded SimSensei project. SimSensei is the new generation of AI: virtual agents that display high levels of artificial emotional intelligence and can engage convincingly in back-and-forth interactions with people. Ellie is the star of the SimSensei project; she's a virtual therapist that, in some studies, has performed better at clinical engagement than her human counterparts.
At this stage, Ellie mostly sees military personnel who have recently returned from deployment and might be suffering from post-traumatic stress disorder (PTSD). Ellie engages users in a private face-to-face interview using natural language and active listening. Yes, listening: Ellie is equipped with eyes and ears. "We use a webcam to track facial expression and head pose, we use Microsoft Kinect to track gesture and body posture and we use a microphone to capture vocal parameters—not what you say but how you say it," Skip told me.
A session with Ellie starts with some background questions to build rapport. Then she asks users, casually and conversationally, about typical symptoms of psychological distress. She monitors and analyzes user responses and reacts accordingly with appropriate follow-up questions and empathetic nods. She even has a convincing grimace in her repertoire.
So far Ellie has been received remarkably well, and according to Skip, users seem to prefer talking with her over a human therapist. In one trial, subjects were separated into two groups: one group was told that a person was controlling Ellie, a setup Skip called the "Wizard of Oz" scenario, while the other was told Ellie was fully computerized. Both groups knew their interviews would ultimately be analyzed by researchers. Even so, those who thought Ellie was fully computerized revealed more negative emotions and reported more indicators of PTSD and psychological distress than those who thought a human was behind her.
"People felt more comfortable in a discussion with Ellie," Skip told me. "They didn't feel judged, had lower interest in impression management and generally provided more information when they thought there wasn't a human in the loop."
Unsurprisingly, Skip said a lot of Ellie's popularity comes down to persistent stigmas surrounding mental illness. "With Ellie you have a virtual human that looks fairly credible and engages in a natural dialogue but there's no risk!" he said. "So you can reveal stigmatizing or embarrassing moments in your life without that risk that somebody is listening to you in the moment."
As well as using her observations to inform her immediate engagement with users, Ellie also analyzes behavioral data at the end of the session to identify the presence of psychological symptoms and help clinicians make a diagnosis.
The team at ICT used existing research on nonverbal expression to come up with a list of telling behavioral signals for Ellie to look out for, including 3D head position and orientation, body posture, intensity and frequency of facial expressions, and self-adaptors (such as self-grooming or touching parts of the body). In addition to physical signals, Ellie was also programmed to identify and analyze voice parameters.
Stefan Scherer leads the speech analysis part of the SimSensei project. He says there are a number of acoustic indicators of depression, such as reduced variation in volume and pitch and increased tension in the vocal tract and vocal folds. But those markers are easily missed by the human ear, making this one area in which Ellie excels compared to her human colleagues.
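Scherer's point about variation is easy to make concrete. The sketch below is purely illustrative, not ICT's actual pipeline: every function name and parameter is invented here. It uses NumPy to measure frame-by-frame loudness and pitch of an audio signal, the kind of "flat prosody" statistic a system like SimSensei might compute.

```python
import numpy as np

SR = 16000    # sample rate in Hz (assumed)
FRAME = 512   # ~32 ms analysis frames

def frame_features(signal, sr=SR, frame=FRAME):
    """Per-frame loudness (RMS energy) and a crude pitch estimate."""
    loudness, pitch = [], []
    for start in range(0, len(signal) - frame, frame):
        x = signal[start:start + frame]
        loudness.append(np.sqrt(np.mean(x ** 2)))
        # Pitch via the strongest autocorrelation peak in the
        # 80-400 Hz band, roughly the range of the speaking voice.
        ac = np.correlate(x, x, mode="full")[frame - 1:]
        lo, hi = sr // 400, sr // 80
        lag = lo + np.argmax(ac[lo:hi])
        pitch.append(sr / lag)
    return np.array(loudness), np.array(pitch)

def prosody_variability(signal):
    """Standard deviation of loudness and pitch across frames.
    Low values on both measures are the 'flat' prosody described above."""
    loudness, pitch = frame_features(signal)
    return loudness.std(), pitch.std()

# Synthetic demo: a monotone "voice" versus one whose pitch and
# volume swing the way lively speech does.
t = np.linspace(0, 2.0, 2 * SR, endpoint=False)
monotone = 0.5 * np.sin(2 * np.pi * 150 * t)
inst_freq = 150 + 40 * np.sin(2 * np.pi * 0.7 * t)   # pitch sweeps 110-190 Hz
varied = (0.5 + 0.3 * np.sin(2 * np.pi * 1.3 * t)) * np.sin(
    2 * np.pi * np.cumsum(inst_freq) / SR)

flat_loud, flat_pitch = prosody_variability(monotone)
lively_loud, lively_pitch = prosody_variability(varied)
# The monotone signal shows far less variability on both measures.
```

A production system would use a far more robust pitch tracker than windowed autocorrelation, but the underlying idea is the same: these statistics are objective and exhaustive in a way human listening is not, which is why a machine can catch markers a clinician's ear misses.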
So Ellie has been equipped with high-level knowledge about emotional intelligence, plus sensors that outperform human observation. Hearing Skip talk about the various trials Ellie has gone through and all the positive feedback ICT has received on her is exciting, but also a little creepy. I suggested to Skip that Ellie is a huge step toward a future where deep personal connections with robots are the norm. But Skip is surprisingly conservative about the future of Ellie and the SimSensei project.
"This is not like HAL," he said. "But I think we're going to see more AI in psychological support roles…Maybe support agents, like virtual buddies."
SimSensei agents are already being trialed to help people with high-functioning autism develop emotional and interpersonal skills for job interviews, he said. "We have folks on the autism spectrum who are quite bright and quite talented. [They] can do a job but they can't get through the social function of a job interview," Skip said. "So we have six different virtual characters for that, and they can be set at different behavioral dispositions; there's a soft touch interviewer, a hostile interviewer..."
Skip told me one of the main aims of his work at ICT is to use technology to make psychological support more accessible, and he thinks that's an area where Ellie has a bright future. "Perhaps you have an older adult living alone and suffering from mild dementia. They might have a buddy, a virtual character that can pop up on their TV or their computer or whatever and remind them to take their medication and engage them in meaningful dialogue…maybe play checkers with them," he said.
"We're taking something that had an original military-funded purpose, but now translating it to a civilian mental health application, and that's really what excites me the most," Skip added.