EU probes Meta over its provisions for protecting children

Has social media biz done enough to comply with Digital Services Act? Maybe not

The European Commission has opened formal proceedings to assess whether Meta, the provider of Facebook and Instagram, may have breached the Digital Services Act (DSA) in areas linked to the protection of minors.

The action follows a separate probe into Meta's alleged failure to properly monitor the distribution of political misinformation by "foreign actors" before June's European elections.

The latest action means the executive of the European Union will examine three potential breaches of the DSA, first introduced in August last year. The legislation allows for fines up to 6 percent of worldwide annual turnover, which in Meta's case would equate to around $8.5 billion.

Firstly, the Commission will investigate whether the social media giant has assessed and mitigated risks caused by the design of Facebook and Instagram's online interfaces, "which may exploit the weaknesses and inexperience of minors and cause addictive behaviour, and/or reinforce the so-called 'rabbit hole' effect."

Secondly, the Commission will examine whether Meta did enough to prevent access by minors to inappropriate content. In particular, it questions whether age-verification tools used by Meta have been reasonable, proportionate and effective.

Lastly, the Commission will put Meta's compliance with DSA obligations under the microscope by looking at whether it has taken "appropriate and proportionate measures" to ensure a high level of privacy, safety, and security for minors. In particular, it will examine whether default privacy settings for minors and recommendation systems comply with the law.

The three strands of the investigation relate to Articles 28, 34, and 35 of the DSA respectively. The Commission is set to continue the investigation by sending additional requests for information, and by conducting interviews or inspections.

Thierry Breton, commissioner for the internal market, said the Commission was not convinced that Meta has done enough to comply with the DSA on Facebook and Instagram.

"We will now investigate in-depth the potential addictive and 'rabbit hole' effects of the platforms, the effectiveness of their age verification tools, and the level of privacy afforded to minors in the functioning of recommender systems. We are sparing no effort to protect our children," he said.

A Commission official noted that the regulator will investigate Facebook and Instagram separately, as they are distinct platforms under the legislation. There is no fixed timetable for these investigations, they said.

The European Commission has been busy enforcing the DSA since it was introduced. Last month it gave TikTok 24 hours to explain the risk assessment procedures it applied before launching TikTok Lite, which offers the chance to "complete challenging tasks and earn great rewards!"

Last December, Elon Musk's X became the first online platform to have formal DSA proceedings brought against it, with the Commission accusing the microblogging platform of disseminating illegal content, among other violations of the recently enacted rules.

An official said X had delayed the launch of features in Grok, its AI bot, until after the European election, recognizing that some of these features may have "risks in the context of civic discourse and elections." The investigation into X is ongoing and the Commission remains in close contact with the platform.

Meta told The Reg: “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools, features and resources designed to protect them.

"This is a challenge the whole industry is facing, which is why we’re continuing to advance industry-wide solutions to age-assurance that are applied to all apps teens access. We look forward to sharing details of our work with the European Commission.” ®
