Researchers Defeated Advanced Facial Recognition Tech Using Makeup
A new study used digitally and physically applied makeup to test the limits of state-of-the-art facial recognition software.
Researchers have found a new and surprisingly simple method for bypassing facial recognition software using makeup patterns.
A new study from Ben-Gurion University of the Negev found that software-generated makeup patterns can be used to consistently bypass state-of-the-art facial recognition software, with digitally and physically applied makeup fooling some systems with a success rate as high as 98 percent.
In their experiment, the researchers defined their 20 participants as blacklisted individuals so their identification would be flagged by the system. They then used a selfie app called YouCam Makeup to digitally apply makeup to the facial images, following a heatmap that targets the most identifiable regions of the face. A makeup artist then replicated the digital makeup on the participants using natural-looking makeup in order to test the target model's ability to identify them in a realistic situation.
“I was surprised by the results of this study,” Nitzan Guettan, a doctoral student and lead author of the study, told Motherboard. “[The makeup artist] didn’t do too much tricks, just see the makeup in the image and then she tried to copy it into the physical world. It’s not a perfect copy there. There are differences but it still worked.”
The researchers tested the attack method in a simulated real-world scenario in which participants wearing the makeup walked through a hallway to see whether they would be detected by a facial recognition system. The hallway was equipped with two live cameras that streamed footage to the MTCNN face detector, allowing the researchers to evaluate the system's ability to identify the participants.
“Our attacker assumes a black-box scenario, meaning that the attacker cannot access the target FR model, its architecture, or any of its parameters,” the paper explains. “Therefore, [the] attacker’s only option is to alter his/her face before being captured by the cameras that feeds the input to the target FR model.”
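In a setup like the one the paper describes, the recognition model typically flags a face by comparing its embedding against those of enrolled (here, blacklisted) faces. The sketch below is a heavily simplified illustration, not the researchers' code: it uses made-up four-dimensional vectors in place of real FaceNet/LResNet embeddings, and a cosine-similarity threshold, which is a common matching rule but an assumption on our part.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_blacklisted(face_embedding, blacklist, threshold=0.8):
    """Flag a face if its embedding is close enough to any blacklisted one.

    In a real pipeline the embedding would come from a recognition model
    such as FaceNet; here we use toy vectors purely for illustration.
    """
    return any(cosine_similarity(face_embedding, ref) >= threshold
               for ref in blacklist)

# Toy embeddings (stand-ins for model outputs, not real data).
blacklist = [[0.9, 0.1, 0.3, 0.2]]
bare_face = [0.88, 0.12, 0.31, 0.19]    # close to the enrolled face
madeup_face = [0.40, 0.70, 0.10, 0.55]  # makeup shifts the embedding

print(is_blacklisted(bare_face, blacklist))    # similar embedding: flagged
print(is_blacklisted(madeup_face, blacklist))  # shifted embedding: passes
```

The attack works at exactly this point: makeup on the most identifiable facial regions perturbs the embedding enough to push it below the match threshold, without the attacker ever needing access to the model itself.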
The attack achieved 100 percent success in the digital experiments on both the FaceNet model and the LResNet model, according to the paper. In the physical experiments, the participants were detected in 47.6 percent of the frames if they weren't wearing any makeup and 33.7 percent of the frames if they wore randomly applied makeup. Using the researchers' method of applying makeup to the highly identifiable parts of the attacker's face, they were recognized in only 1.2 percent of the frames.
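Those percentages are frame-level detection rates: the fraction of video frames in which the system recognized the participant. A minimal sketch of that metric (the function name is our own, not the paper's):

```python
def detection_rate(frame_results):
    """Fraction of frames in which the participant was recognized.

    frame_results: list of booleans, one per video frame
    (True = the system identified the participant in that frame).
    """
    if not frame_results:
        return 0.0
    return sum(frame_results) / len(frame_results)

# Toy run of 10 frames under the attack condition:
frames = [False] * 9 + [True]
print(f"{detection_rate(frames):.1%}")  # 10.0%
```

Under this metric, the drop from 47.6 percent (no makeup) to 1.2 percent (targeted makeup) means the system went from recognizing the participant in roughly half the frames to almost never.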
The researchers are not the first to demonstrate how makeup can be used to fool facial recognition systems. In 2010, artist Adam Harvey’s CV Dazzle project presented a host of makeup looks designed to thwart algorithms, inspired by “dazzle” camouflage used by naval vessels in World War I.
Various studies have shown how facial recognition systems can be bypassed digitally, such as by creating “master faces” that could impersonate others. The paper references a study where a printable sticker was attached to a hat to bypass the facial recognition system, and another where eyeglass frames were printed.
While all of these methods might hide someone from facial recognition algorithms, they have the side effect of making you very visible to other humans—especially if attempted somewhere with high security, like an airport.
In the researchers' experiment, they addressed this by having the makeup artist use only conventional makeup techniques and neutral color palettes to achieve a natural look. Considering its success in the study, the researchers say this method could technically be replicated by anyone using store-bought makeup.
Perhaps unsurprisingly, Guettan says she generally does not trust facial recognition technology in its current state. “I don’t even use it on my iPhone,” she told Motherboard. “There are a lot of problems with this domain of facial recognition. But I think the technology is becoming better and better.”