
Why HR and cybersecurity need to collab on spotting fake IT workers

One security pro provides a red-flag list for HR.


In February 2025, Katelyn Halbert, market intelligence manager at deepfake-detection company Pindrop, noticed that an interviewee on a video call seemed a bit off. Their answers were not only vague but also delayed by a few seconds. While Halbert is no stranger to a nervous job candidate, she noticed the interviewee’s lip movements weren’t syncing to the audio—and that was enough to make her go straight to her team’s chief people officer and VP of research.

“That first moment, I was very taken aback and honestly intimidated,” Halbert told us.

Halbert’s candidate didn’t land the gig, but the interview made her realize that HR needs to be aware of the tech-aided tactics that fake IT candidates will employ during interviews. These days, finding the right person for the job—and a real person for the job—can require some collaboration between HR and cybersecurity teams.

Tim Rawlins, senior advisor and security director at cybersecurity company NCC Group, said deepfakery in interviews frequently stems from state-run schemes backed by North Korea, in which stolen or entirely synthetic identities are used to obtain IT jobs advertised as fully remote. If the fraudsters are hired, they bypass external defenses, operate as insiders, and gain access to a corporate network. (The FBI has said that North Korea also dispatches remote IT workers “to generate revenue for the regime.”)

In a recent LinkedIn post, Amazon Chief Security Officer Stephen Schmidt said the company, since April 2024, had stopped “more than 1,800 suspected DPRK [North Korean] operatives” from joining the team. And in a March 6 advisory, Microsoft wrote that threat actors are leveraging AI to “get hired, stay hired, and misuse access at scale.”

And bad news, defenders—deepfakes are getting good.

What to do. With HR often handling the first interview in any given hiring cycle, its team members have to spot early signs of suspicious behavior. Halbert’s practices include making video mandatory (i.e., no more phone calls), and using the company’s own fraud-detection tech to watch for suspicious locations and VPN use (which hides true location).


Rawlins recommends HR professionals find a way to ask candidates about their current location to check local knowledge—a football team, a train station, a nearby park. And watch for when a candidate repeats a question, because they might just be putting that query into an AI bot to find the answer.

(TechCrunch reported on a recent tactic for exposing a DPRK deepfaker: Ask them to insult their leader.)

Cybersecurity teams need to help HR. Security should provide a list of “red flags” to its HR counterparts, according to Rawlins. Suspicious signs could include:

  • Refusal to verify address
  • Mismatches, even minor ones, across official documents and application materials
  • Environmental inconsistencies (is it nighttime, for example?)
  • Obvious use of AI in photos

“The recruitment really does need to be a security control, and the pre-employment screening has got to be part of your threat assessment for the organization,” Rawlins said.

And HR needs to help cybersecurity teams. Rawlins also advised HR teams to escalate signs of suspicion to the cybersecurity team or other relevant parties, like Halbert did.

Pindrop, as a deepfake-detection company, has an especially important reason to learn about the latest deception tactics—and that’s why its HR and cybersecurity teams work together frequently.

“Half of my recruiting team now is actively trying to interview fake candidates,” Vijay Balasubramaniyan, CEO and co-founder at Pindrop, told us in January. “Whenever we see something weird, we’re actually saying, ‘Hey, we want to do an interview with you.’”

But to be an effective deepfake detector, HR professionals have to go against what they’re taught, Halbert said, which is to not make quick judgments and to give people the benefit of the doubt. In addition, HR folks might not be used to deeply scrutinizing a candidate’s current location down to parks supposedly near their house.

“All of these things that we’re now starting to make a judgment on,” Halbert said. “That isn’t a natural feeling for most HR folks.”

About the author

Billy Hurley

Billy Hurley has been a reporter with IT Brew since 2022. He writes stories about cybersecurity threats, AI developments, and IT strategies.
