A hack that fools Face Recognition AI into false identification

Face recognition AI is increasingly being used at airports and other security checkpoints, especially during the pandemic, since it lets staff identify people while maintaining social distancing. But a recent discovery by researchers at the cybersecurity firm McAfee shows that these face recognition systems are not all that perfect.


Researchers at McAfee tested a face recognition system similar to the ones used at airports for passport verification. They fed the system an image, created by machine learning, that looks like one person to the human eye but is recognized as someone else by the face recognition software. In principle, this could allow someone on a no-fly list to board a flight under the identity of another person who holds a valid booking.

“If we go in front of a live camera that is using facial recognition to identify and interpret who they’re looking at and compare that to a passport photo, we can realistically and repeatedly cause that kind of targeted misclassification,” said Steve Povolny, one of the McAfee researchers.

To trick the face recognition algorithm, the McAfee researchers used CycleGAN, an image-to-image translation algorithm that can, for example, restyle a photo to look like a Monet painting or turn a summer scene into a winter one.
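
For readers curious what that translation step looks like in practice, here is a minimal PyTorch sketch of CycleGAN-style inference. It is an illustration only: the generator is a placeholder standing in for a trained network, and the file names are hypothetical, since McAfee has not published its model or code.

```python
# Minimal sketch of CycleGAN-style inference (unpaired image-to-image translation).
# Assumes PyTorch/torchvision. The generator below is a hypothetical placeholder --
# in a real CycleGAN it would be a trained encoder-decoder mapping domain A to domain B.
import torch
import torch.nn as nn
from torchvision import transforms
from PIL import Image

generator = nn.Identity()  # placeholder for a trained generator G: A -> B

preprocess = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),  # scale to [-1, 1]
])

def translate(image_path: str, output_path: str) -> None:
    """Run one photo through the generator and save the translated result."""
    img = Image.open(image_path).convert("RGB")
    x = preprocess(img).unsqueeze(0)             # shape (1, 3, 256, 256)
    with torch.no_grad():
        y = generator(x)                         # translated image, still in [-1, 1]
    y = (y.squeeze(0) * 0.5 + 0.5).clamp(0, 1)   # back to [0, 1] for saving
    transforms.ToPILImage()(y).save(output_path)

if __name__ == "__main__":
    # Demo with a synthetic grey image so the sketch runs without any real photos.
    Image.new("RGB", (256, 256), color=(128, 128, 128)).save("photo_of_person_a.jpg")
    translate("photo_of_person_a.jpg", "translated.jpg")
```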

The team fed CycleGAN roughly 1,500 photos of the project leads, and after hundreds of attempts the network produced an image that the face recognition system matched to one person while the human eye still perceived the other.
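
The article does not describe McAfee's exact pipeline, but the "hundreds of attempts" presumably came down to a check along these lines: generate a candidate image, embed it with a face recognition model, and accept it only when the matcher scores it as the target identity. In the sketch below, the embed_face() stand-in, the cosine-similarity metric, and the 0.6 threshold are illustrative assumptions, not McAfee's code.

```python
# Sketch of the acceptance test behind an iterative misclassification attack:
# a candidate image "fools" the matcher when its embedding is close enough to
# the target person's passport photo. All numbers and functions are illustrative.
import zlib
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    """Stand-in for a real face-recognition embedding (e.g. a 128-d FaceNet-style
    vector). Here it just derives a deterministic pseudo-random unit vector from
    the pixel bytes so the sketch runs without a trained model."""
    rng = np.random.default_rng(zlib.crc32(image.tobytes()))
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity, a common face-verification score."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def fools_matcher(candidate: np.ndarray,
                  target_passport_photo: np.ndarray,
                  match_threshold: float = 0.6) -> bool:
    """True if the matcher would verify the candidate image as the *target* person."""
    score = cosine(embed_face(candidate), embed_face(target_passport_photo))
    return score >= match_threshold

if __name__ == "__main__":
    # Stand-in images; in the real attack these would be a CycleGAN output
    # and the target person's passport photo.
    candidate = np.random.randint(0, 255, (256, 256, 3), dtype=np.uint8)
    passport = np.random.randint(0, 255, (256, 256, 3), dtype=np.uint8)
    print("matcher fooled:", fools_matcher(candidate, passport))
```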

But there are two caveats to the study. First, the researchers tested a face recognition system similar to, but not the same as, the ones used at airport security. “I think for an attacker that is going to be the hardest part to overcome, where [they] don’t have access to the target system,” said Povolny. Second, CycleGAN takes time to generate such an image, and the software requires high-end hardware to run properly.

The researchers intended the study to point out the vulnerability of face recognition systems and the dangers of relying solely on these checks.

“AI and facial recognition are incredibly powerful tools to assist in the pipeline of identifying and authorizing people,” Povolny says. “But when you just take them and blindly replace an existing system that relies entirely on a human without having some kind of a secondary check, then you all of a sudden have introduced maybe a greater weakness than you had before.”


Original Source