
An AI security company’s tango with a deepfake applicant

Jason Rebholz, co-founder and CEO of Evoke Security, had a perplexing interaction after posting about vacant roles at his company.


Brianna Monsanto is a reporter for IT Brew who covers news about cybersecurity, cloud computing, and strategic IT decisions made at different companies.

Many people start off the new year with fun, low-stakes resolutions. Jason Rebholz, co-founder and CEO of Evoke Security, had a more unusual beginning to his 2026: He inadvertently interviewed a deepfaked candidate for a security researcher role.

It all started with a LinkedIn post. In January, Rebholz used LinkedIn to promote a few vacant roles at his AI security company. He received a message from an individual who claimed to know just the right person for an open security researcher position.

“It wasn’t somebody that I had known before, but this is kind of the nature of dealing with posting things publicly,” Rebholz said.

But as Rebholz continued to chat with this new LinkedIn contact, certain details didn’t line up. For example, the contact said the person they were recommending was based overseas, even though their résumé showed their last gig was in San Francisco. In addition, even though the contact had posted that they were open to work, they weren’t applying for the Evoke position themselves.

“He had said, ‘Oh, I just took another job,’ or whatever,” Rebholz said.

To make things weirder, the referrer also had a cartoon character as their LinkedIn profile picture instead of an actual headshot. However, Rebholz knew many security professionals like to remain private.

“Normally, I would discount that…but it kind of goes with that type of persona where I’m like, ‘Okay, it might just line up with this profile,’” Rebholz said.

The LinkedIn user offered to introduce Rebholz to the candidate, who went by the name Kenta, over email — even making sure Rebholz saw the intro email and notifying him when Kenta responded.

“I get referrals from people all the time. I’ve never had somebody come out and try to apply a pressure tactic of, ‘You need to respond right away to this person,’” Rebholz said.

Nice to meet you? After emailing back and forth with Kenta, Rebholz arranged a time to speak with them on a Google Meet call. Until this point, Rebholz hadn’t suspected he’d end up face-to-face with a deepfake.

But on the day of the meeting, Rebholz remembered, Kenta joined with his camera off.

“It took about 30 to 45 seconds for him to start the camera,” Rebholz said. “And this is when I was like, ‘This is going to be a deepfake…This does not feel right anymore.’”

Once Kenta appeared on camera, Rebholz quickly took note of his virtual background and video quality. “It just felt really off and then I started looking at other indications of his face,” Rebholz said. “It just felt very soft. It felt like I was talking to a virtual person at that point.”

A clip from Jason Rebholz's interview (no sound)


Although Rebholz typically likes to kick off interviews by getting a better sense of the candidate’s background, he said Kenta quickly took control of the conversation, asking questions about the role. After playing ball for a bit, Rebholz asked the candidate to go over his career history, and it became very clear something wasn’t right.

“He literally laughed at that,” Rebholz said. “And so I think it was at that point I was like, ‘Alright, this guy clearly can tell that I’m suspicious at this point.’”

The aftermath. Rebholz managed to record a portion of the interview, which he later shared with deepfake detection company Moveris. They determined there was a 95% chance that face-swapping technology was used in the clip. iProov CTO Dominic Forrest told IT Brew that Rebholz’s encounter had all the “hallmarks of a scam,” and closely followed the playbook of a traditional social engineering attack by leveraging a sense of urgency to deceive someone.

“I don’t know what the person was after, but it sounds like a low-effort attempt, if I may put it that way, rather than perhaps North Koreans trying to put somebody into a cybersecurity company or something like that,” Forrest said.

As IT Brew previously reported, fake North Korean IT worker schemes have remained a challenging problem for businesses. Threat actors are also getting creative about leveraging social platforms like LinkedIn. Forrest, for example, said he regularly sees threat actors impersonate employees at his own company, making their accounts appear more credible by adding real iProov employees to boost their connection counts.

Deepfake lessons. Rebholz believes his experience, which he shared in a January post on LinkedIn, should be a wake-up call to the industry that deepfake attacks can happen to businesses of all sizes.

“There’s this perception that you got to be a large company,” Rebholz said. “But let me be the case study as a small startup. Everyone is potentially at risk here.”

He said professionals should always trust their gut if anything feels suspicious during virtual interviews. And if they suspect they are facing a fake candidate, Rebholz added, they shouldn’t be afraid to test that suspicion by asking the interviewee to remove their virtual background or interact with an item in the frame: “It just takes that willingness to have an awkward conversation and challenge somebody to go do that.”
