Computers Can Read Emotions Better Than You Can

A new computer can distinguish between 21 different 'complex emotions.'
Image: Aleix Martinez

Last week, we learned computers can tell when you’re lying, and now they might be better at reading emotions than you are.

Researchers at Ohio State University programmed a computer to recognize facial expressions based on more than 5,000 images taken from 230 volunteers who were responding to verbal cues such as “you smell a bad odor” or “you got some unexpected news.”

For a while now, facial analysis software has been able to distinguish between the six “basic categories” of emotion—happiness, surprise, anger, sadness, fear, and disgust. If you asked me to do the same, I could probably do it. But when you drill down into complex, compound facial expressions such as “happily surprised,” “fearfully angry,” “appalled,” “hatred,” and “awed,” I’d probably blow a couple of them. This computer doesn’t. In fact, it can distinguish between 21 different “complex emotions.”

Take a look at these pictures:

Image: PNAS

If one of your friends made the face for “sadly disgusted,” would you be able to tell the difference between that and “fearfully angry”? Maybe if I knew the context and hung out with the person all the time I could make a reasonable guess, but that’s all it’d be—a guess. Maybe that’s why I so often find myself asking (usually jokingly), “what’s with the face?”

Not so with the computer. It turns out there’s surprisingly little variation among Americans when it comes to facial expressions. Nearly everyone activates the exact same muscles to express “happiness,” and nearly everyone activates the same muscles to show “happily surprised.”
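To make that concrete, here’s a minimal sketch of the logic: if each emotion reliably fires the same set of facial muscles, recognition reduces to matching an observed activation pattern against known prototypes. Everything below is invented for illustration: the muscle columns, the prototype vectors, and the nearest-match rule are stand-ins, not the researchers’ actual method, which relied on far richer measurements of facial features.

```python
# Hypothetical sketch: classify a face by which muscles it activates.
# The muscle columns and prototype patterns below are invented for
# illustration; the actual study used detailed facial-feature data.
import numpy as np

# Columns (hypothetical): lip-corner pull, jaw drop, brow raise,
#                         brow lower, nose wrinkle, lip-corner depress
PROTOTYPES = {
    "happy":             np.array([1, 0, 0, 0, 0, 0]),
    "surprised":         np.array([0, 1, 1, 0, 0, 0]),
    "happily surprised": np.array([1, 1, 1, 0, 0, 0]),  # compound: union of its parts
    "disgusted":         np.array([0, 0, 0, 1, 1, 0]),
    "sadly disgusted":   np.array([0, 0, 0, 1, 1, 1]),  # one muscle away from "disgusted"
}

def classify(observed):
    """Return the label whose prototype differs from the observation in the fewest muscles."""
    return min(PROTOTYPES, key=lambda label: int(np.sum(PROTOTYPES[label] != observed)))

# Lip corners pulled, jaw dropped, brows raised:
print(classify(np.array([1, 1, 1, 0, 0, 0])))  # -> "happily surprised"
```

The researchers’ bet, in effect, is that these muscle patterns are consistent enough across people for a scheme like this, scaled up to thousands of images and many more facial measurements, to work.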

That’s part of the usefulness of this research—by using a computer to analyze facial expressions, we can show that we’re not all that different after all. But beyond that, lead researcher Shichuan Du thinks that face-analyzing computers can be used to diagnose certain mental disorders such as autism, schizophrenia, and PTSD, because people suffering from those disorders often react differently to social cues than others do.

“A particular area of interest is the perception of facial expressions of compound emotions in psychiatric disorders, social and cognitive impairments, and studies of pain,” Du wrote in Proceedings of the National Academy of Sciences.

But, more obviously, it’s another step towards machines that can decipher what we feel, tell us apart, and talk to us. In that context, it’s easy to imagine a future filled with robotic companions and therapists. As for me, I'll still be trying to figure out what the heck the emoticon >:-O means.