Report: Biometric Injection Attacks on the Rise

This type of attack uses face-swapped videos to try to trick online verification systems, but as these attacks rise in prevalence, so too do methods for combating them.

Bad actors have started using AI-generated face-swap videos to trick computers that use a camera to identify users before unlocking their devices or accounts.

This technique is called a face-swap injection attack, and, according to a recent report, its use is growing massively. In recent years, online verification based on matching a selfie video against an ID or stored biometric information has become more common. Attempts to trick these systems are not new, but past attackers relied on analog tricks such as photos or masks.

Now, however, a recent report has found increased use of generative AI to swap in victims' faces. Attackers are submitting manipulated videos to verification systems by digitally "injecting" them, according to the report from biometric verification provider iProov.

Advances in generative AI have made face swaps more believable, and iProov tracked a 704 percent increase in face-swap injection attacks in the second half of 2023 versus the first half. Plenty of face-swapping tools now offer free user tiers, lowering the barrier to entry for this attack method.

Akif Khan, a VP analyst at Gartner focused on identity and access management, has also heard of a dramatic rise in injection attacks from the vendors he works with, who he said have reported an increase on "the order of hundreds, if not thousands of percent in the last year."

But he offered an important caveat: that growth is from a very low base, as the attack form was previously barely used.

Many biometric verification vendors have methods for detecting virtual cameras, so injection attackers have been turning to emulators to hide their virtual cameras, per iProov. The report found that emulator use rose 353 percent during 2023. According to fraud prevention company SEON, emulator software helps attackers pass one device off as another, making a PC laptop look like an iPhone, or even like a specific other device.
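
In reduced form, spotting such a disguise is a consistency check between what a session claims to be and what the hardware actually reports. Here is a minimal Python sketch, where the device table and the looks_like_emulator() helper are invented for illustration, not any vendor's actual fingerprinting logic:

```python
# Illustrative emulator red flag: the claimed device model and the
# reported hardware properties disagree. DEVICE_PROFILES holds assumed
# values for demonstration, not an authoritative fingerprint database.

DEVICE_PROFILES = {
    # model: native screen resolution (width, height) in pixels
    "iPhone 14": (1170, 2532),
    "Pixel 7": (1080, 2400),
}

def looks_like_emulator(claimed_model: str, reported_resolution: tuple) -> bool:
    """Flag sessions whose claimed model and reported screen disagree."""
    expected = DEVICE_PROFILES.get(claimed_model)
    if expected is None:
        return True  # unknown model: route the session to manual review
    return reported_resolution != expected

# A laptop-hosted emulator claiming to be an iPhone, but exposing a desktop screen:
print(looks_like_emulator("iPhone 14", (1920, 1080)))  # True -> flag the session
```

Real fingerprinting weighs many more signals, but the mismatch principle is the same.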

But there are various ways to spot biometric injection attacks, too.

Active liveness detection asks users to take actions like blinking or turning their heads. Such tests aren’t just looking for whether users follow instructions, but also whether anything seems suspicious as they move, Khan said.
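
The value of such a test comes from unpredictability: a pre-rendered or replayed video is unlikely to perform a randomly chosen action on cue. A minimal Python sketch of the challenge-response core, where detect_action() stands in for a hypothetical computer-vision model:

```python
import secrets

# Active liveness as challenge-response: pick an unpredictable action,
# then pass the user only if that action appears in the response video.

CHALLENGES = ["blink", "turn_head_left", "turn_head_right", "smile"]

def issue_challenge() -> str:
    # A cryptographic RNG keeps the challenge unpredictable to attackers.
    return secrets.choice(CHALLENGES)

def verify_liveness(video_frames, challenge: str, detect_action) -> bool:
    """Pass only if the requested action is observed in the response video."""
    observed = detect_action(video_frames)  # e.g., returns "blink"
    return observed == challenge
```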

In passive liveness detection, meanwhile, systems might shine a random sequence of colored lights onto a user’s face and see if the light bounces back, said Andrew Newell, iProov chief scientific officer. Or the system might look for tiny details like micromovements or blood flow, Khan said.
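
In very reduced form, the flashing-light idea checks whether the illumination reflected from the face tracks the random color sequence the screen just emitted. The toy sketch below is only a rough approximation of that concept (iProov's actual method is proprietary); each "frame" here is pre-reduced to an average (R, G, B) tuple:

```python
import secrets

# Toy flashing-colors check: emit a random color sequence, then verify
# each response frame is tinted by the color that was flashed at it.

COLORS = {"red": 0, "green": 1, "blue": 2}  # color -> dominant RGB channel

def make_sequence(length: int = 8) -> list:
    return [secrets.choice(list(COLORS)) for _ in range(length)]

def dominant_channel(avg_rgb: tuple) -> int:
    return max(range(3), key=lambda i: avg_rgb[i])

def sequence_reflected(frames: list, sequence: list) -> bool:
    """True if every frame's dominant tint matches the color flashed at it."""
    return all(
        dominant_channel(frame) == COLORS[color]
        for frame, color in zip(frames, sequence)
    )

# A pre-recorded injected video cannot anticipate the random sequence:
print(sequence_reflected([(200, 40, 30), (30, 40, 200)], ["red", "green"]))  # False
```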

Some detection systems also use machine learning or deep neural network models trained to determine whether images are real, Khan said. Humans lack visibility into how such models reach their conclusions, however, making their decisions inscrutable.
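
One way to picture the opacity problem: the model hands back only a score, not a rationale. In this hypothetical sketch (the model interface is an assumption, not a real library API), the system decides from the number alone:

```python
# Hypothetical ML-based authenticity gate: a trained model scores each
# frame between 0 (fake) and 1 (genuine); the system sees the score,
# not the reasoning behind it.

def authenticity_decision(frames, model, threshold: float = 0.98) -> str:
    scores = [model(frame) for frame in frames]
    average = sum(scores) / len(scores)
    # Explaining *why* a video scored 0.41 requires separate
    # interpretability tooling; the decision itself is just a threshold.
    return "pass" if average >= threshold else "review"

stub_model = lambda frame: 0.41  # stand-in; a real model consumes pixel data
print(authenticity_decision(["frame1", "frame2"], stub_model))  # "review"
```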

Defenders don’t need to rely on just checking the image, either. Abnormalities in other parts of the process can trigger suspicion. For example, an image whose metadata suggests it was taken on a particular iPhone model should have a certain pixel resolution. Receiving an alleged iPhone image with a different resolution is a red flag, Khan said.
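
A rough sketch of that cross-check using Pillow, which can read both an image's pixel dimensions and its EXIF camera-model tag; the expected-resolution table holds assumed values for illustration, not an authoritative device database:

```python
from PIL import Image

# Cross-check an image's claimed camera model (EXIF tag 272, "Model")
# against the pixel resolution that model's sensor actually produces.

EXPECTED_RESOLUTIONS = {
    # assumed values for illustration: a 12 MP iPhone 13 main sensor
    "iPhone 13": {(4032, 3024), (3024, 4032)},
}

def resolution_red_flag(path: str) -> bool:
    """True if the image's size contradicts its claimed camera model."""
    img = Image.open(path)
    model = img.getexif().get(272)  # 272 is the standard EXIF "Model" tag
    if model not in EXPECTED_RESOLUTIONS:
        return False  # unknown model: this particular check can't apply
    return img.size not in EXPECTED_RESOLUTIONS[model]
```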

Khan also said that details such as how someone moves through an online registration form, even before reaching the face verification step, provide clues. For example, a fraudster who has filled out the form several times will move more quickly and confidently through the process, which could be revealed in mouse motions, a lack of pausing, copying and pasting, and other patterns.
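
A toy scoring pass over those signals might look like the following; the thresholds and weights are invented for illustration, whereas production systems derive them from labeled traffic:

```python
from dataclasses import dataclass

# Toy behavioral-risk score over the form-filling signals described above.

@dataclass
class FormSession:
    seconds_to_complete: float
    paste_events: int
    pauses_over_2s: int  # natural hesitations while filling fields

def fraud_risk_score(session: FormSession) -> int:
    score = 0
    if session.seconds_to_complete < 20:  # suspiciously fast for a first-timer
        score += 2
    if session.paste_events >= 3:         # pasting rehearsed identity data
        score += 2
    if session.pauses_over_2s == 0:       # no natural hesitation at all
        score += 1
    return score  # e.g., route scores >= 3 to step-up verification

print(fraud_risk_score(FormSession(12.0, 5, 0)))  # 5 -> escalate
```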

Dr. Stephanie Schuckers, director of the Center for Identification Technology Research, said a growing and diverse set of injection attack detection solutions has emerged. But work is still underway to assess how effective each one is and to create a standard approach.

Not all fraudsters need face swaps to trick online verification. Attackers targeting a particular victim may have an easier time putting their own face on the victim's physical ID than trying to manipulate video, Khan said. Attackers could also use generative AI to create fake IDs.

Attackers and defenders will almost certainly both keep innovating. For example, many systems match users against a stored biometric template, and Schuckers said it's important that template databases be defended against hackers. This is an emerging area, with vendors now starting to develop defenses that safeguard templates without interrupting their use in biometric matching.
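
One family of such defenses is "cancelable biometrics": a revocable, user-specific transform is applied before storage, so matching still works but the raw template never sits in the database. A minimal sketch using a random projection, with dimensions and threshold invented for illustration (real schemes add quantization and formal security analysis):

```python
import numpy as np

# Cancelable-biometrics sketch: store only a randomly projected template
# and match probes in the projected space. If the stored value leaks,
# re-enroll the user under a fresh projection.

rng = np.random.default_rng(seed=42)  # per-user key material in practice

def protect(template: np.ndarray, projection: np.ndarray) -> np.ndarray:
    return projection @ template  # random projection roughly preserves distances

def match(probe, stored, projection, threshold: float = 0.9) -> bool:
    p = protect(probe, projection)
    cosine = float(p @ stored) / (np.linalg.norm(p) * np.linalg.norm(stored))
    return cosine >= threshold

dim, proj_dim = 128, 64
projection = rng.normal(size=(proj_dim, dim))
enrolled = rng.normal(size=dim)          # stand-in for a face-embedding vector
stored = protect(enrolled, projection)   # only this goes in the database

noisy_probe = enrolled + rng.normal(scale=0.05, size=dim)
print(match(noisy_probe, stored, projection))  # True: same person still matches
```
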
Jule Pattison-Gordon is a senior staff writer for Government Technology. She previously wrote for PYMNTS and The Bay State Banner, and holds a B.A. in creative writing from Carnegie Mellon. She’s based outside Boston.