
University Park, Pennsylvania — Mobile devices use facial recognition technology to let users quickly and securely unlock their phones, make financial transactions, and access medical records. However, new research involving the Penn State College of Information Sciences and Technology shows that facial recognition technologies employing a particular user-detection method are highly vulnerable to deepfake-based attacks, raising significant security concerns for users and applications.
The researchers found that most application programming interfaces (APIs) that use facial liveness verification — a feature of facial recognition technology that uses computer vision to confirm the presence of a live user — do not reliably detect digitally altered photos or videos of individuals made to look like a live version of someone else, also known as deepfakes. Applications that rely on these detection methods were also found to be significantly less effective at identifying deepfakes than their providers claim.
“In recent years, we have observed significant development of facial recognition and verification technologies, which have been deployed in many security-critical applications,” said Ting Wang, associate professor of information sciences and technology and a principal investigator on the project. “Meanwhile, we have also seen substantial advances in deepfake technology, making it fairly easy to synthesize lifelike facial images and videos at low cost. We therefore asked: is it possible for an attacker to exploit deepfakes to trick facial verification systems?”
The research, presented at the USENIX Security Symposium this week, is the first systematic study of the security of facial liveness verification in a real-world setting.
Wang and his collaborators developed a new deepfake-powered attack framework called LiveBugger, which enables customizable, automated security evaluation of facial liveness verification. They used it to evaluate six leading commercial facial liveness verification APIs. Vulnerabilities in these products can be inherited by other apps that build on them, potentially threatening millions of users, the researchers said.
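To make the idea of automated evaluation concrete, here is a minimal sketch in Python of the kind of harness such a framework implies: feed deepfaked samples through each vendor API and record how often they are accepted as live. The names `evaluate_api`, `ProbeResult`, and `bypass_rate` are illustrative inventions, and the `verify` callable stands in for a real vendor SDK call; this is not LiveBugger's actual code.

```python
# Sketch of an automated liveness-evaluation loop (hypothetical names).
from dataclasses import dataclass
from pathlib import Path
from typing import Callable, Iterable

@dataclass
class ProbeResult:
    api_name: str
    sample: str
    passed: bool  # True if the deepfake was accepted as a live user

def evaluate_api(api_name: str,
                 verify: Callable[[bytes], bool],
                 samples: Iterable[Path]) -> list[ProbeResult]:
    """Submit each deepfaked image/video to one liveness API and log the verdict."""
    results = []
    for sample in samples:
        passed = verify(sample.read_bytes())  # vendor-specific call in practice
        results.append(ProbeResult(api_name, sample.name, passed))
    return results

def bypass_rate(results: list[ProbeResult]) -> float:
    """Fraction of deepfake probes the API accepted as live."""
    return sum(r.passed for r in results) / len(results) if results else 0.0
```

In practice each commercial API has its own authentication, request format, and response schema, so `verify` would wrap a vendor-specific client rather than a single uniform function.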
LiveBugger used deepfaked images and videos drawn from two separate data sets to try to fool the apps’ facial liveness verification methods, which aim to verify a user’s identity by analyzing static or video images of the face, listening to the voice, or measuring the user’s response when asked to perform an action on command.
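The verification modes described above can be modeled roughly as follows; the enum names and the probe pairings are illustrative assumptions, not taken from the paper or from any vendor documentation.

```python
# The common liveness checks, modeled so a harness could pair each API
# with the right kind of deepfake probe (illustrative only).
from enum import Enum, auto

class LivenessMethod(Enum):
    IMAGE = auto()    # single static photo of the face
    SILENT = auto()   # short video, no challenge issued
    VOICE = auto()    # user reads a prompted phrase aloud
    ACTION = auto()   # user performs a commanded action (blink, nod)

def probe_media_for(method: LivenessMethod) -> str:
    """Pick the kind of deepfaked media needed to attack each method (illustrative)."""
    return {
        LivenessMethod.IMAGE: "face-swapped photo",
        LivenessMethod.SILENT: "face-swapped video",
        LivenessMethod.VOICE: "face-swapped video dubbed with synthesized speech",
        LivenessMethod.ACTION: "deepfake video re-enacting the commanded motion",
    }[method]
```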
The researchers found that all four of the most common verification methods could be easily bypassed. In addition to demonstrating how their framework circumvents these methods, they offer suggestions for improving the technology’s security, including eliminating verification that analyzes only a static image of a user’s face, and matching lip movements with the user’s voice in methods that analyze both audio and video from users.
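As a rough illustration of the lip-matching suggestion, the toy check below correlates a per-frame mouth-openness signal with the audio loudness envelope and rejects poorly synchronized pairs. Production systems rely on learned audio-visual models; the function names and the 0.5 threshold here are placeholders, not published values.

```python
# Toy lip-sync consistency check: a deepfake whose mouth motion does not
# track the audio should score low. Assumes both signals are sampled at
# the same rate and aligned in time.
import numpy as np

def lip_sync_score(mouth_openness: np.ndarray, audio_envelope: np.ndarray) -> float:
    """Pearson correlation between per-frame mouth openness and audio energy."""
    m = (mouth_openness - mouth_openness.mean()) / (mouth_openness.std() + 1e-8)
    a = (audio_envelope - audio_envelope.mean()) / (audio_envelope.std() + 1e-8)
    return float(np.mean(m * a))

def passes_lip_sync(mouth_openness: np.ndarray,
                    audio_envelope: np.ndarray,
                    threshold: float = 0.5) -> bool:
    # The threshold is an arbitrary placeholder for illustration.
    return lip_sync_score(mouth_openness, audio_envelope) >= threshold
```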
“Facial liveness verification can defend against many attacks, but the development of deepfake technology has created a new threat about which little has been known until now,” said Changjiang Li, a doctoral student in information sciences and technology at Penn State. “Our findings can help vendors fix the vulnerabilities in their systems.”
The researchers have reported their findings to the vendors of the applications used in the study; one vendor has since announced plans to launch a deepfake detection project to counter the emerging threat.
“Facial liveness verification is being applied in many critical scenarios, such as online payments, online banking, and government services,” said Wang. “Furthermore, a growing number of cloud platforms have begun to offer facial liveness verification as a platform-as-a-service, so its security is of great concern.”
Wang and Li collaborated with Zhaohan Xi, a doctoral student in informatics at Penn State; Li Wang and Shanqing Guo of Shandong University; and Shouling Ji and Xuhong Zhang of Zhejiang University. Penn State’s contributions were supported in part by the National Science Foundation.