Why Robots Need to Feel Pain

“Pain in the Machine” investigates whether pain and suffering are as essential for machine learning as they are for human cognitive development.
Spot the Robot Dog gets kicked. Video: Boston Dynamics/Cambridge University/YouTube

In "Lisa the Skeptic," a characteristically quotable episode of The Simpsons, a burning android flees a laboratory fire, robotically lamenting, "Why? Why was I programmed to feel pain?"

Video: FOX/apala734/YouTube

The question is played for laughs, but like so many memorable scenes from this most beloved of shows, it also taps into some of the deeper, overarching themes that define our modern civilization.

Pain is a fundamental fact of life for many organisms on our planet: a crucial mechanism for identifying which actions pose serious threats to our physical and mental health. As robots become more sophisticated and interactive, should they also be programmed to experience pain to prevent injuries to themselves or others, and if so, to what extent?

"Pain in the Machine," a 12-minute documentary released by the University of Cambridge on Monday, tackles this multifaceted and controversial issue. The film offers insights from artificial intelligence thought leaders, practicing physicians, and other interdisciplinary experts, and contrasts them with iconic popular culture moments that point to the larger philosophical questions inherent to artificially programming pain responses—including a nod to burning robot bit in The Simpsons.

"Pain in the Machine" documentary. Video: University of Cambridge/YouTube/Little Dragon Films

As in so many fields of AI research, evaluating the utility and benefits of pain in robots inevitably turns the mirror back on our understanding of how those experiences function and protect us in our own lives.

"Pain has fascinated philosophers for centuries," Ben Seymour, a Cambridge-based expert on the computational and systems neuroscience of pain, comments in the documentary. "Indeed, some people consider pain to be the pinnacle of consciousness. Of course, it's not a pleasant pinnacle of consciousness but it arguably is a time where we feel most human, because we are most in touch with ourselves as a mortal human being."

This idea that pain is a profoundly humanizing force, in spite of how excruciating it can feel moment to moment, is a constant across many eras and cultures. As the author James Baldwin put it: "[T]he things that tormented me most were the very things that connected me with all the people who were alive, or who had ever been alive."

It remains to be seen whether basic reflexive pain responses, which have already been programmed into some AI systems, could evolve into more complex emotions like empathy, or the kind of solidarity through suffering described by Baldwin. Perhaps robots could even surpass the cognitive and conceptual limits of their human creators, pioneering new approaches to interacting with the world and its inhabitants.

"Humans do seem to be no different from very complex machines made up of biological material," points out Marta Halina, a lecturer in the philosophy of cognitive science at the University of Cambridge, in "Pain in the Machine."

"That has huge implications on thinking about the future of AI, because we might be able to build machines that are as complex as us, and thus have abilities like us; for example, the ability to feel pain," Halina said. "And if we can build machines that are even more complex than humans, then they might have experiences and abilities that we can't even imagine."

Artistic ruminations on these themes, like those featured in the popular new HBO series Westworld, have vaulted these lines of speculation even further into the mainstream. "Pain in the Machine" provides an interesting glimpse into the real efforts to replicate the hardware of pain in robots, while looking ahead to the ethical, philosophical, and social issues that programmable suffering could lead to in the future.
