Big Tech has failed to police Russian disinformation, EC study concludes
In Putin's Russia, the planet hacks you
The power of the EU's Digital Services Act (DSA) to actually police the world's very large online platforms (VLOPs) has been tested in a new study focused on Russian social media disinformation.
The independent study of the DSA's risk management framework, published by the EU's executive arm, the European Commission, concluded that social media platforms' commitments to curb the reach and influence of global online disinformation campaigns have largely failed.
The reach of Kremlin-sponsored disinformation has only increased since the major platforms all signed a voluntary Code of Practice on Disinformation in mid-2022.
"In theory, the requirements of this voluntary Code were applied during the second half of 2022 – during our period of study," the researchers said. We're sure you're just as shocked as we are that social media companies failed to uphold a voluntary commitment.
Between January and May of 2023, "average engagement [of pro-Kremlin accounts rose] by 22 percent across all online platforms," the study said. In absolute numbers, the report found, Meta led the pack on engagement with Russian disinformation. However, the increase was "largely driven by Twitter, where engagement grew by 36 percent after CEO Elon Musk decided to lift mitigation measures on Kremlin-backed accounts," researchers concluded. Twitter, now known as X, pulled out of the disinformation Code in May.
Across the platforms studied – Facebook, Instagram, Telegram, TikTok, Twitter and YouTube – Kremlin-backed accounts have amassed some 165 million followers and have had their content viewed at least 16 billion times "in less than a year." None of the platforms we contacted responded to questions.
Telegram is not classified as a VLOP under the DSA, so it does not have to comply until next year, when the Act takes effect for all online platforms.
"There were instances of effective mitigation that targeted very specific accounts and reduced the risk level for the audiences of those channels," the researchers concluded in their study. "However, at the systemic level of all accounts on the platforms engaged in Kremlin disinformation campaigns, the mitigation measures failed."
Narrowly scoped mitigation policies – such as policing only known Kremlin-affiliated accounts – were easy to circumvent, and even where moderation succeeded, offending accounts simply directed users elsewhere, since cross-platform manipulation was ignored entirely. Platforms "rarely reviewed and removed more than 50 percent of the clearly violative content," and inconsistently applied pre-existing policies banning hate speech and incitement to violence.
The end result? Despite commitments to prevent it, "The reach of these pro-Kremlin networks has more than doubled since the war began."
Too Big Tech to handle?
The EU's Digital Services Act and its requirements that VLOPs (defined by the Act as companies large enough to reach 10 percent of the EU's population, or roughly 45 million people) police illegal content and disinformation became enforceable late last month.
Under the DSA, VLOPs are also required "to tackle the spread of illegal content, online disinformation and other societal risks," such as, say, the massive disinformation campaign being waged by the Kremlin since Putin decided to invade Ukraine last year.
The researchers also examined how the DSA's rules could be used to limit the spread of Russian disinformation, whether that disinformation met the threshold that would require VLOPs to police it, and whether they actually did so.
"The Kremlin's ongoing disinformation campaign… causes risks to public security, fundamental rights and electoral processes inside the European Union," the study concluded, meaning it triggered the need for a VLOP response dictated by Articles 34 and 35 of the DSA. That response was piecemeal at best.
Crucially, however, the VLOPs weren't yet bound by the DSA's requirements to limit the spread of disinformation during the period studied – those obligations only took effect, as mentioned above, in late August.
Now that VLOPs are bound by the DSA, will anything change? We asked the European Commission if it can take any enforcement actions, or whether it'll make changes to the DSA to make disinformation rules tougher, but have yet to hear back.
Two VLOPs are fighting their designation: Amazon and German fashion retailer Zalando. The two orgs claim that as retailers, they shouldn't be considered in the same category as Facebook, Pinterest, and Wikipedia.
In the meantime, we already had a glimpse of how DSA compliance might fail: shortly before the VLOP rules took effect last month, the platforms weren't ready, and so far the DSA appears not to have changed a thing. The rest of the tech world will have to fall into line from February next year – and on current form, it looks unprepared too. ®