Here's how crooks will use deepfakes to scam your biz

Need some tools of deception? GitHub's got 'em

All of the materials and tools needed to make deepfake videos – from source code to publicly available images and account authentication bypass services – are readily available and up for sale on the public internet and underground forums. 

Cyber criminals are taking advantage of this easy access to resources, and using deepfakes to build on today's crime techniques, such as business email compromise (BEC), to make off with even more money, according to Trend Micro researchers. Not only that, but deepfakes are being used in web ads to make Elon Musk, security specialists, and others appear to endorse products with which they have no connection.

"The growing appearance of deepfake attacks is significantly reshaping the threat landscape for organizations, financial institutions, celebrities, political figures, and even ordinary people," the security outfit's Vladimir Kropotov, Fyodor Yarochkin, Craig Gibson, and Stephen Hilt warned in research published on Tuesday.

Corporations in particular need to worry about deepfakes, we're told, as criminals are beginning to use them to create fake individuals – such as bogus job seekers who scam their way into roles – or to impersonate executives on video calls and hoodwink employees into transferring company funds or data.

Over the summer, the FBI said it had received a growing number of complaints about the use of deepfake videos in interviews for tech jobs that involve access to sensitive systems and information.

Once they've convinced someone to hire them, deepfake actors can use their fake identities to trick unsuspecting customers or coworkers into sharing payment info, or use that network access to explore IT assets, steal corporate data, deploy ransomware, or worse.

Just last month a Binance PR exec claimed crooks created a deepfake "AI hologram" of him to scam cryptocurrency projects via Zoom video calls.

"It turns out that a sophisticated hacking team used previous news interviews and TV appearances over the years to create a 'deep fake' of me," Patrick Hillmann, chief communications officer at the crypto hyper-mart, claimed at the time. "Other than the 15 pounds that I gained during COVID being noticeably absent, this deep fake was refined enough to fool several highly intelligent crypto community members."

The Binance deepfake is notable as Trend Micro says the topic of how to bypass verification using deepfakes has been a hot one on underground forums since 2021. In general, many of these forums' users are looking for ways to scam online banking and digital finance verification, according to the security researchers. They explain: 

It is likely that criminals interested in these services already possess copies of victims' identificatory documents, but they also need a video stream of the victims to steal or create accounts. These accounts could be used later for malicious activities like money laundering or illicit financial transactions.

Additionally, deepfake production tools are bought and sold on nefarious online souks — or available in the open on GitHub — along with bots that can make deepfake video creation easier, the researchers added, citing the Telegram bot RoundDFbot as one example.

Deepfake + existing scam = more money for crooks

Trend Micro says criminals are using deepfakes for a variety of tried-and-true attack methods and scams, and the researchers expect to see more of these in the near future.

This includes messenger scams and BEC, which have proven extremely profitable even without phony videos. Miscreants can use deepfakes to impersonate executives or business partners to request money transfers, thus making these scams even more believable to the targeted victims.

Criminals can also use stolen identities in combination with deepfake videos to open new bank accounts or sign up for government services, the security researchers warn. Similarly, they can take over existing accounts that rely on video calls for identity verification.

"They can hijack a financial account and simply withdraw or transfer funds," Kropotov, Yarochkin, Gibson, and Hilt wrote. "Some financial institutions require online video verification to have certain features enabled in online banking applications. Obviously, such verifications could be a target of deepfake attacks as well."

While we've already seen deepfakes used in disinformation campaigns, notably related to the Russian invasion of Ukraine, these phony videos can also be used in extortion-related attacks – imagine fake "evidence" being created to force organizations to pay a ransom, the researchers note.

Trend Micro also puts Amazon's Alexa "on the target list of deepfake criminals." Alexa isn't alone, though: any device that uses voice recognition – whether to reorder cat food or to open the door to a secure wing of a hospital – could be hijacked by deepfakes.

The good news is that organizations can take steps to protect themselves. Top of this list is using multi-factor authentication, which, according to the security biz, "should be standard for any authentication of sensitive or critical accounts."

Use three things to authenticate users, it advises: something the user has, something the user knows, and something the user is.
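To make that three-factor advice concrete, here's a minimal Python sketch of how such a check might be wired together. The TOTP helper follows RFC 6238, but the function names, the PBKDF2 parameters, and the biometric match threshold are illustrative assumptions, not anyone's production authentication flow:

```python
# A minimal sketch of a three-factor check: something the user knows
# (password), has (TOTP device), and is (biometric). Illustrative only.
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password ("something the user has")."""
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


def verify_user(password: str, otp_code: str, biometric_score: float,
                stored_hash: bytes, stored_salt: bytes,
                totp_secret: bytes) -> bool:
    # Factor 1: something the user knows (salted password hash;
    # iteration count is an assumption for this sketch).
    knows = hmac.compare_digest(
        hashlib.pbkdf2_hmac("sha256", password.encode(), stored_salt, 100_000),
        stored_hash,
    )
    # Factor 2: something the user has (TOTP from an enrolled device).
    has = hmac.compare_digest(totp(totp_secret), otp_code)
    # Factor 3: something the user is (match score from a hypothetical
    # biometric sensor; the 0.95 threshold is made up for illustration).
    is_match = biometric_score >= 0.95
    # All three must pass: a deepfaked video feed might spoof the
    # biometric check alone, but not the other two factors.
    return knows and has and is_match
```

The point of stacking the factors is exactly the one the researchers make: a convincing deepfake can defeat a face-on-video check in isolation, but it doesn't get the attacker a password hash match or a one-time code from an enrolled device.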

Also, train staff in what to look and listen out for when it comes to deepfake technology. "For verification of sensitive accounts (for example bank or corporate profiles), users should prioritize the use of the biometric patterns that are less exposed to the public, like irises and fingerprints," the researchers advised. ®