Cybersecurity

University of Waterloo team breaks voice-authentication systems

As underground forums increasingly ask about deepfakes, researchers find a way past voice verification.

Top insights for IT pros

From cybersecurity and big data to cloud computing, IT Brew covers the latest trends shaping business tech in our 4x weekly newsletter, virtual events with industry experts, and digital guides.

Audio-authentication mechanisms, sometimes used in banking or other over-the-phone transactions, have high-tech methods of distinguishing a fake voice from a real one. So naturally, academics are coming up with high-tech methods that could help hackers use a fake voice to avoid detection.

University of Waterloo researchers, embracing their inner cat and mouse, found a way to get past audio-deepfake countermeasures by adding some lifelike imperfections and subtracting some sound signatures.

The demonstration fooled many voice-authentication (VA) systems within six tries, according to the U of ’Loo crew, who said the hack “call[s] into question the security of modern VA systems.”

“It was an academic paper that’s trying to be preventative, but I’m sure that in the future, we’ll hear about it in the real world,” Andre Kassis, a computer security and privacy PhD candidate who was the lead author of the study, told IT Brew.

The tests were conducted against “five renowned ASVs,” or automatic speaker verification systems, including Amazon Connect Voice ID.

After gathering audio of a target, Kassis’s team used it to create the required authentication phrase. A verification system will normally find signatures in copycat audio and reject the attempt, but “then our attack comes into play,” Kassis told us. After some trial and error, the Waterloo researchers applied three main wave transformations to sneak past the security: introducing realistic background noise, eliminating digital artifacts not present in human speech, and “increasing the magnitude of higher frequencies.”
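The paper's actual pipeline isn't published here, but the three transformation types Kassis describes can be illustrated with a minimal NumPy sketch. Everything below — the function name, the noise level, the smoothing kernel, and the 4 kHz boost cutoff — is a hypothetical stand-in for illustration, not the Waterloo team's code:

```python
import numpy as np

def evade_countermeasures(wave, sr=16000, noise_level=0.005, boost_db=6.0):
    """Illustrative sketch of the three transformations the article names:
    add realistic noise, smooth away synthesis artifacts, and boost
    higher frequencies. Parameters are placeholders, not from the paper."""
    # 1. Introduce low-level background noise to mask synthetic regularity.
    noisy = wave + np.random.normal(0.0, noise_level, wave.shape)

    # 2. Eliminate digital artifacts: a simple moving average smooths
    #    sample-level discontinuities not present in human speech.
    kernel = np.ones(5) / 5.0
    smoothed = np.convolve(noisy, kernel, mode="same")

    # 3. Increase the magnitude of higher frequencies via the FFT
    #    (here, a flat gain on everything above an assumed 4 kHz cutoff).
    spectrum = np.fft.rfft(smoothed)
    freqs = np.fft.rfftfreq(len(smoothed), d=1.0 / sr)
    gain = np.where(freqs > 4000, 10 ** (boost_db / 20.0), 1.0)
    boosted = np.fft.irfft(spectrum * gain, n=len(smoothed))
    return boosted.astype(np.float32)

# One second of a 440 Hz tone as a stand-in for cloned speech audio.
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
fake_voice = np.sin(2 * np.pi * 440 * t).astype(np.float32)
adversarial = evade_countermeasures(fake_voice, sr=sr)
```

In a real attack the input would be the cloned authentication phrase, and the exact parameters would be tuned by the trial and error Kassis mentions.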

Voice and vice. Audio-authentication hacks have already hit the real world. In February 2023, a Vice reporter used an AI-powered voice replica to break into his bank account.

Deepfake services are a hot topic on underground forums, according to a September 2022 report from Trend Micro.

“In these discussion groups, we see that many users are targeting online banking and digital finance verification,” the cybersecurity company said. It also recommended that bank-account verifiers use “biometric patterns that are less exposed to the public, like irises and fingerprints.”

A 2022 report from The Insight Partners cited in the University of Waterloo paper predicted that “the global voice biometrics market share is expected to grow from $1.31 billion in 2021 to $4.82 billion by 2028.”
