
Computers Know When You're Faking It

A machine has now outwitted humans at a perception-related task. That's pretty big.
Can you tell which expressions are real pain? The one on the left is faking it. Image: University of California San Diego

As social creatures, humans evolved facial expressions to signal to others how we feel. Then we evolved further, learning to mimic those expressions so people can't tell how we really feel. If you ask me, the second part is just as crucial to social harmony as the first. Imagine if you could never mask your true emotions. It'd be anarchy.

This is why I find artificial intelligence research of the sort published this week in the journal Current Biology somewhat unsettling. A team from the University of California, San Diego and the University of Toronto found that a computer system was able to tell when people were faking a pained expression even better than humans could. Much better, in fact. The humans couldn't tell the difference between the real and fake looks any better than if they just guessed randomly, the report found, but the computer got it right 85 percent of the time.

We already know machines outperform us at plenty of narrow tasks, and they're getting better all the time. Now they're inching closer to seeming sentient: starting to recognize natural language, detect tone in text, pick up on social cues, and "feel" empathy. The researchers say this study is one of the first examples of a computer beating humans at a perception-related task. That's pretty big.

As interesting as it all is, the possibility of our mechanical brethren calling us out on a little white lie is a dystopian scenario I'd really rather avoid. And yet, research into things like facial recognition and machine vision is blowing up.

Marian Bartlett, the lead study author, has been at the forefront of such research, and spun out a company, Emotient, that makes software based on facial expression detection. She's working on programming machines to better recognize emotions through people's facial cues; so far the software has the basics down: joy, sadness, anger, fear, and disgust.

This latest study found that the mouth in particular is a dead giveaway that you're faking pain. The fakers opened their mouths at too-regular intervals, the report found, while the authentic expressions showed organic, irregular patterns.

The theory is that's because voluntary and involuntary nerve responses, and the corresponding facial movements, are controlled by two different motor pathways in the brain: the subcortical extrapyramidal motor system when the feeling's reflexive and spontaneous, and the cortical pyramidal motor system when it's deliberate and intentional.

The distinction is subtle and easily missed by a human, even when they're looking for it. But a computer can be programmed to pick up on these structural cues.
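To get a feel for the kind of cue involved, here's a deliberately simplified sketch. The study's actual system was a trained machine-vision classifier analyzing facial action dynamics; this toy version (all names and the threshold are illustrative, not from the paper) just scores how regularly the mouth opens over time, flagging suspiciously even spacing as a possible fake:

```python
import statistics

def interval_regularity(event_times):
    """Coefficient of variation of the gaps between mouth-opening events.

    Lower values mean clockwork-like regularity (the pattern the study
    associated with faked pain); higher values mean the organic,
    irregular timing seen in genuine expressions.
    """
    intervals = [b - a for a, b in zip(event_times, event_times[1:])]
    return statistics.stdev(intervals) / statistics.mean(intervals)

def looks_faked(event_times, threshold=0.25):
    # threshold is an illustrative cutoff, not a value from the study
    return interval_regularity(event_times) < threshold

# Evenly spaced openings (suspiciously regular) vs. irregular ones
faked_times = [0.0, 1.0, 2.0, 3.0, 4.0]
genuine_times = [0.0, 0.4, 1.9, 2.3, 4.0]

print(looks_faked(faked_times))    # True
print(looks_faked(genuine_times))  # False
```

A real system would of course work on video frames and many facial cues at once, but the principle is the same: timing statistics a human eye glosses over are trivial for software to measure.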

Think about what that could mean. There could soon be robots or wearables that can read people with superhuman accuracy, thwarting our best efforts to hide a feeling that's inconvenient, or worse. Beyond the inevitable personal embarrassment, there are potential implications for law enforcement, advertising, and a host of other social situations that rely on astute perception.

Actually, it's already starting. As Wired pointed out, there's a Google Glass app in beta whose makers say it gives a real-time readout of the emotional expressions of people in your field of view. There's also a car that can tell when you're tired or not paying attention.

As far as pain detection goes, that could be useful for medical professionals trying to assess a patient's state. It certainly beats our current method, which couldn't be more low-tech, or more human: a doctor asking, "Tell me how much it hurts on a scale of 1 to 10," and just believing you.