Deepfakes - counterfeit content generated by AI algorithms - are on the rise, staining the internet with doctored pornography, fake videos of political leaders, and bot accounts.
There are now 14,678 deepfake videos plastered on the net, according to a report [PDF] written by Deeptrace, a startup focused on building software that can detect the machine-learning forgeries. That number has shot up almost 100 per cent over the past nine months, from 7,964 videos.
It should come as no surprise that most of them - 96 per cent, in fact - are pornographic. After all, deepfakes first made headlines back in January last year, when internet perverts began using the technology to swap celebrities' faces onto porn stars in X-rated clips.
Creepy people began trading fake videos of their favorite pop stars and actresses, and sharing tips on crafting custom porn. The code needed to generate these horrendous creations was publicly available on repositories like GitHub, so with a little technical know-how and the right training data, the clips weren't difficult to make.

Fast forward more than a year, and it's now even easier. Desktop applications like DeepNude let users forge nude images with a few clicks: give the software a picture, and it pastes the subject's likeness onto a naked body. DeepNude's developers pulled the code after widespread criticism, but the damage had already been done - GitHub has since been trying to delete DeepNude ripoffs from its platform.

The DeepNude devs sold the software under a premium license to Windows and Linux desktop users for $50 a pop. Some buyers then went on to resell the app to other seedy individuals, hoping to make some money of their own.
The report also found internet marketplaces offering custom deepfakes for sale, charging up to $30 for audio clips that clone a victim's voice to make them appear to say something they never actually said, or $10 for a simpler fake text.
All these smutty fake clips have racked up a whopping 134,364,438 views across the top four porn websites dedicated to deepfakes. And guess what: 100 per cent of them targeted women.
In the 4 per cent of deepfake content that isn't pornographic, however, 61 per cent of videos feature men - normally Hollywood actors, political leaders, or occasionally tech CEOs.
One of the biggest concerns is that the technology could sow political discord and undermine elections. Deepfake videos have already rocked Gabon and Malaysia. A video appearance by Gabonese president Ali Bongo was jittery and stilted, prompting viewers to question its authenticity. The clip also surfaced while the president was lying low amid an attempted coup by Gabon's military; the government was accused of concealing the true state of Bongo's health.
In Malaysia, the Minister of Economic Affairs Azmin Ali was depicted engaging in homosexual sex acts with a rival political aide. Sodomy is illegal in Malaysia. Ali has denounced the video as a deepfake churned up to destroy his career.
And that's not all, either. Bot accounts on Twitter and LinkedIn using deepfake images as profile pictures have cropped up, too. Two of the most notorious belonged to a so-called Maisy Kinsley and a Katie Jones. Kinsley claimed to be a journalist at Bloomberg and attempted to contact Tesla short sellers, whilst Jones posed as a researcher at a think tank, apparently hoping to spy on government officials.
AI researchers are racing to develop machine-learning algorithms that can detect deepfakes. In many cases, though, judgement still relies on educated guesswork: if something looks strange - maybe an earring out of place, or a telltale blurry wrinkle - then the clip might just be a deepfake.
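That telltale blurriness can, in principle, be measured rather than eyeballed. As a purely illustrative sketch - not a detector Deeptrace or anyone else actually ships, and with function names and thresholds invented for the example - the toy Python below uses NumPy's FFT to gauge how much of an image patch's energy sits in high spatial frequencies; a pasted-in face that is slightly softer than its surroundings would score lower than the rest of the frame.

```python
import numpy as np

def high_freq_ratio(img, cutoff=0.25):
    # Fraction of the patch's spectral energy above a radial
    # frequency cutoff. Blurring suppresses high frequencies,
    # so a blurry pasted-in region scores lower than a sharp one.
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(spectrum) ** 2
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # Radial distance from the centre of the spectrum, normalised
    radius = np.hypot(yy - h / 2, xx - w / 2) / (min(h, w) / 2)
    return power[radius > cutoff].sum() / power.sum()

def blur(img):
    # Crude blur: average each pixel with its four neighbours
    return (img + np.roll(img, 1, 0) + np.roll(img, -1, 0)
            + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 5

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))  # stand-in for an untouched frame region
soft = blur(sharp)            # stand-in for a blurry pasted-in face
print(high_freq_ratio(sharp) > high_freq_ratio(soft))  # prints True
```

Real detectors are far more elaborate - trained neural networks rather than a single spectral statistic - but the underlying idea, hunting for artifacts the generator leaves behind, is the same.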
“The speed of the developments surrounding deepfakes means this landscape is constantly shifting, with rapidly materializing threats resulting in increased scale and impact. It is essential that we are prepared to face these new challenges. Now is the time to act,” the report concluded. ®
PS: The US state of California has approved a law that "bans the distribution of manipulated videos and pictures that maliciously aim to give someone a false impression about a political candidate’s actions or words within 60 days of an election," according to Assemblyman Marc Berman (D).