These Glasses Fool Facial Recognition Into Thinking You’re Someone Else

Hack the system with a simple but effective form of camouflage.

Looking to stay out of trouble? Now you can avoid facial recognition, or even "frame" someone else (get it?) by impersonating them, using new eyeglasses developed by researchers at Carnegie Mellon University.

The glasses, recently described in this paper and featured on the excellent blog Prosthetic Knowledge, contain patterns on the front of the frames that hide the wearer's identity from facial recognition algorithms. For example, one of the researchers who developed the glasses, a white man, passed for actress Milla Jovovich. His colleague, a South Asian woman, successfully impersonated a Middle Eastern man.

The glasses were developed to mislead facial recognition programs that use neural networks (a form of machine learning loosely modeled on the brain), said researcher and co-creator Mahmood Sharif. The specs had a 90 percent success rate in fooling the facial recognition software Face++, which is used for detection, tracking, and analysis, such as estimating a person's age, gender, or identity.

They should be "physically realizable and inconspicuous."

Sharif said that ideally, the glasses should look ordinary, like just another pair of colorful tortoiseshells. They should be "physically realizable and inconspicuous," the researchers write.

Their goal is to disrupt the recognition software's neural networks, which map the pixel values of a photo to an identity by comparing them against patterns learned from other images. A small, carefully chosen change to a face can completely throw off the system, so the researchers designed the glasses' pattern to introduce exactly that kind of change and derail the network's reading of the pixels.
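For a rough sense of how this works, here is a minimal sketch (not the CMU team's code) of the underlying idea: nudge only the pixels inside an eyeglass-shaped region of a photo until a classifier stops predicting the wearer's identity. The network, photo, mask coordinates, and step counts below are placeholders standing in for a real face-recognition system.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

# Stand-in for a face recognizer: a randomly initialized classifier over 10
# hypothetical identities. The CMU work targets real face-recognition networks.
model = resnet18(num_classes=10).eval()

# Placeholder "photo" of the wearer and the identity the system currently assigns.
image = torch.rand(1, 3, 224, 224)
true_id = torch.tensor([3])

# Binary mask covering an eyeglass-frame-shaped region (here just a band
# across the eyes, purely for illustration).
mask = torch.zeros_like(image)
mask[:, :, 80:110, 40:184] = 1.0

# The perturbation is only allowed where the mask is 1, i.e. on the "glasses".
delta = torch.zeros_like(image, requires_grad=True)
optimizer = torch.optim.Adam([delta], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    adv = torch.clamp(image + delta * mask, 0.0, 1.0)
    logits = model(adv)
    # Dodging: push the prediction *away* from the true identity by
    # maximizing its loss (hence the negated cross-entropy).
    loss = -F.cross_entropy(logits, true_id)
    loss.backward()
    optimizer.step()

adv = torch.clamp(image + delta * mask, 0.0, 1.0)
print("predicted identity after attack:", model(adv).argmax(dim=1).item())
```

Because only the masked pixels ever change, the perturbation corresponds to something the attacker could physically wear, rather than an edit spread across the whole image.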

Image: Carnegie Mellon University

"Our algorithm solves an optimization problem to find what the colors of the eyeglasses should be to achieve impersonation or dodging," he told Motherboard. "Our algorithm also tries to ensure that the transition between colors are smooth (similarly to natural images), the the front plane of the glasses can be printed by a commodity printer, and that the glasses will fool the face-recognition system, even if images of the attacker are taken from slightly different angles or the attacker varies her facial expression."

Machine learning and facial recognition systems are more common than you might think. Facebook uses facial recognition to tag friends automatically, and about half the US population is included in police facial recognition databases. So understanding how to fool these algorithms, and then build defenses against such attacks, may be important for security as we become more reliant on machine learning.

Fooling facial recognition can allow a criminal to evade surveillance or gain access to things they shouldn't be able to access, said Sharif. But a more benign reason to fool facial recognition could be to regain some privacy.

"Our work shows that one particular kind [of algorithm] — face recognition based on deep neural networks — can be misled by an adversary who has the ability to do nothing except relatively alter her appearance," said Sharif. "Before we deploy deep neural networks in safety or security-critical contexts, we need to better understand how to make them robust in the face of malicious interference."

The researchers said that aside from showing how people can fool image recognition, their work also exposes the limits of other systems that rely on machine learning, such as cancer-diagnosis tools and self-driving cars.

"As our reliance on technology increases, we sometimes forget that it can fail," the researchers write in their paper. "In some cases, failures may be devastating and risk lives. Our work and previous work show that the introduction of machine learning to systems, while bringing benefits, increases the attack surface of these systems."

The researchers advised that these machine learning algorithms should be improved to withstand evasion, such as the kind their glasses accomplish.
