Cybersecurity

With deepfakes, facial recognition can’t fight alone

Even with deepfake-driven doubt, IT pros aren’t bailing on biometrics.

The market intelligence firm Gartner made a prediction: Some businesses just won’t like your face—as an authentication factor, that is.

The advisory stated that a rise in AI-based deepfake attacks on face biometrics will lead 30% of enterprises to doubt the reliability of the identity-verification option “in isolation” by 2026.

While the prediction arrives alongside new reports of deepfake-led compromises, industry pros who spoke with IT Brew aren’t ready to do an about-face on face-auth. The biometric option just needs to include other factors—like behavioral and location-specific analysis—to keep facial recognition out of isolation.

“We just can’t give up on a very wonderful authentication method because it’s got vulnerabilities. Instead, what we need to do is constantly and consistently up our game so that digital trust really is trusted and enables us to exist on the wild, wild west that is the internet,” Sushila Nair, member of the ISACA Emerging Trends Working Group and cybersecurity practice lead at Capgemini North America, told IT Brew.

And deepfakes have gotten, in a word, wild.

In September 2023, joint guidance from CISA, the FBI, and the NSA summarized the threats of synthetic media, noting that malicious actors create convincing likenesses for financial gain.

“These may include impersonating key leaders or financial officers and operating over various mediums using manipulated audio, video, or text to illegitimately authorize the disbursement of funds to accounts belonging to the malicious actor,” the report read.

Akif Khan, a Gartner VP analyst who fields calls from ecommerce, government, banking, and other business clients throughout the year, notes two emerging types of deepfake attack.

In a presentation attack, someone uses their camera to capture an image or video and “presents” it to another device. In an injection attack, the synthetic video is “injected” directly into a vendor’s API, tricking the system into believing the likeness is genuine.

A threat intelligence report from the identity-verification provider iProov noted a growing interest in “face swap” tools—the kinds of products that frequently have free tiers—that aim to fool biometric authentication providers.

“I think there’s a growing understanding that these deepfakes are increasingly convincing, and tools to make them are increasingly available. So, I think those things really just combine to create doubt in people’s minds,” Khan said.

IT Brew reported in August on deepfake tools becoming increasingly easy to use.

Authentication providers like Duo Security support biometric options like Face ID, but they also offer a “Trusted Endpoints” capability that uses factors like device certificates to verify a known device.
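
As a rough illustration of the device-trust idea, here is a minimal sketch; the enrollment store, fingerprint values, and function names are hypothetical, and this is not Duo’s actual implementation. A verifier compares the fingerprint of a client-presented certificate against devices enrolled earlier, and only then accepts the biometric factor:

```python
# Conceptual sketch only: a request counts as coming from a "known device"
# when the client certificate's SHA-256 fingerprint matches an enrolled one.
# The enrollment store, fingerprints, and function names are hypothetical.
import hashlib

# Hypothetical fingerprints recorded when IT enrolled each device.
ENROLLED_DEVICE_FINGERPRINTS = {
    "aa" * 32,  # placeholder 64-hex-character fingerprint
}

def fingerprint(cert_der: bytes) -> str:
    """Return the SHA-256 fingerprint of a DER-encoded client certificate."""
    return hashlib.sha256(cert_der).hexdigest()

def is_trusted_endpoint(cert_der: bytes) -> bool:
    """True only if the presenting device was previously enrolled."""
    return fingerprint(cert_der) in ENROLLED_DEVICE_FINGERPRINTS

def allow_face_auth(cert_der: bytes, face_match: bool) -> bool:
    # Require both signals, so a convincing deepfake alone is not enough.
    return is_trusted_endpoint(cert_der) and face_match
```

The point is the conjunction: the face check and the device check must both pass, echoing the view that facial authentication shouldn’t stand alone.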

“We’ve never encouraged anybody to rely solely on facial authentication, even though it’s the easiest thing from a user-facing perspective,” Wendy Nather, head of advisory CISOs at Cisco, which acquired Duo in 2018, told IT Brew.

Forward-looking vendors are layering in multiple detection capabilities that augment the facial-recognition factor, according to Khan, including location intelligence (detecting, say, multiple identities from the same Wi-Fi network) and behavioral analysis.

“Let’s say I’m an attacker, and I’ve got 10 fake IDs here, and I’ve gone through this process 10 times today. Unless I’m very disciplined, by the tenth time, I am probably going to display a level of navigational fluency that a one-off user would not be displaying. Maybe I’m not going to read the terms and conditions, because I’ve seen them nine times today already,” said Khan, who recommends not using facial-recognition features in isolation.

“By layering in these other capabilities…it may be that you detect the attack, but it’s not because you detected the deepfake. You’ve actually detected all of this other metadata around it that suggests to you that something is wrong here.”
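
What that layering might look like in practice is sketched below; the signal names, weights, and thresholds are invented for illustration and don’t reflect Gartner’s or any vendor’s actual scoring. The idea is that the metadata around an attempt, not the deepfake itself, can tip the decision:

```python
# Conceptual sketch of layering signals around face verification.
# All field names, weights, and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class VerificationAttempt:
    face_match_score: float       # 0-1 score from the biometric check
    ids_seen_from_network: int    # distinct identities from this network today
    seconds_on_terms_page: float  # proxy for "navigational fluency"
    device_trusted: bool          # e.g., an enrolled device certificate

def risk_score(a: VerificationAttempt) -> float:
    score = 0.0
    if a.face_match_score < 0.90:
        score += 0.4   # weak biometric match
    if a.ids_seen_from_network >= 5:
        score += 0.3   # many identities from one network
    if a.seconds_on_terms_page < 2.0:
        score += 0.2   # suspiciously fluent navigation
    if not a.device_trusted:
        score += 0.1   # unknown endpoint
    return score

def decision(a: VerificationAttempt) -> str:
    s = risk_score(a)
    if s >= 0.5:
        return "deny"
    if s >= 0.3:
        return "step-up"   # ask for an additional factor
    return "allow"

# A flawless deepfake (high face score), but the tenth identity from one
# network, racing past the terms page, on an unknown device: the metadata,
# not deepfake detection, flags it.
attempt = VerificationAttempt(
    face_match_score=0.97,
    ids_seen_from_network=10,
    seconds_on_terms_page=1.2,
    device_trusted=False,
)
print(decision(attempt))  # prints "deny"
```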
