Meta, X sign up to Euro Commish code of conduct on hate speech

Under the Digital Services Act, monitors will be able to report abusive content, and platforms should respond within 24 hours

Online platform companies, including X and Meta, have signed up to a new code of conduct aimed at targeting online hate speech, which the European Commission has now baked into the Digital Services Act.

The DSA was passed in July 2022 to create "a safer digital space where the fundamental rights of users are protected and to establish a level playing field for businesses."


Article 45 of the Act provides a mechanism for devising codes of conduct, one aim of which is to address online hate speech while protecting freedom of expression. A revised "Code of conduct on countering illegal hate speech online" has now been integrated into the DSA framework, which encourages voluntary codes of conduct to tackle online risks.

Built on the existing 2016 Code of conduct on countering illegal hate speech online, the new code has been signed by vid streaming platform Dailymotion, Facebook, Instagram, Jeuxvideo.com, LinkedIn, Microsoft hosted consumer services, Snapchat, Rakuten Viber, TikTok, Twitch, X and YouTube, the European Commission said.

Observers might be surprised that Meta's platforms and X — owned by Elon Musk — have signed up to the code. The latter is currently the subject of a DSA investigation into recent changes to its recommendation algorithms, under formal proceedings opened in December 2023, just after the law came into force.

In May last year, the European Commission opened formal proceedings to assess whether Meta, the provider of Facebook and Instagram, may have breached the DSA in areas linked to the protection of minors, and it has also investigated the platform over misinformation. The social media giant has additionally ditched fact-checking moderators in the US.

However, the new code seems to be a work in progress. The Commission said it would monitor and evaluate how well the Code of Conduct+ objectives and recommendations are being met, and facilitate the regular review and adaptation of the Code.

“This process will be part of the continuous monitoring of platforms' compliance with existing rules,” it said in a statement.

The new code proposes a network of not-for-profit or public sector "Monitoring Reporters" with expertise in illegal hate speech, who would regularly assess signatories' compliance with hate speech rules; the network may also include so-called "Trusted Flaggers" who alert companies to problematic content. Participants must commit to reviewing at least two-thirds of hate speech notices received from monitoring reporters within 24 hours, or to make their "best effort" to do so.

Signatories have additionally committed to take part in "structured multi-stakeholder cooperation with experts and civil society organizations that can flag the trends and developments of hate speech that they observe, helping to prevent waves of hate speech from going viral".

X Corp and Media Matters for America are set to go to trial this year following a judge's refusal to toss the billionaire's lawsuit. The case builds on research reported in November 2023, which documented ads on X from companies like IBM, Apple, Oracle and AT&T appearing alongside posts promoting hate speech. X said the not-for-profit campaign group's research only followed major brands and racist trolls in an effort to stack the deck for its purposes. ®
