Two days into the Digital Services Act, EU wields it to deepen TikTok probe
Bloc isn't happy with made-in-China network's efforts to protect kids and data
Two days after its Digital Services Act (DSA) came into effect, the European Union used it to open an investigation into made-in-China social network TikTok.
European Commissioner Thierry Breton delivered news of the probe in a Xeet that revealed the investigation will consider "suspected breach of transparency & obligations to protect minors."
The formal proceedings follow a preliminary investigation that saw TikTok submit a risk assessment report in September 2023. The European Commission also sent formal Requests for Information regarding illegal content on the platform, the measures it uses to protect minors, and data access.
The EU was able to ask those questions because it used the DSA to designate TikTok a Very Large Online Platform – a service with more than 45 million monthly active users in the EU, which the bloc requires to meet its strictest regulations.
TikTok's responses evidently did not satisfy the Commission, which has now opened formal proceedings against the platform under the DSA.
The fresh investigation will consider the following matters:
- Compliance with DSA obligations related to the assessment and mitigation of systemic risks, in terms of actual or foreseeable negative effects stemming from the design of TikTok's system, including algorithmic systems, that may stimulate behavioral addictions and/or create so-called 'rabbit hole effects.' The Commission will also assess TikTok's mitigation measures, notably the age verification tools it uses to prevent minors accessing inappropriate content, which it suspects may not be "reasonable, proportionate and effective";
- Compliance with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors within the design and functioning of its "recommender systems";
- Compliance with DSA obligations to provide a searchable and reliable repository for advertisements presented on TikTok;
- Measures taken by TikTok to increase the transparency of its platform, especially "suspected shortcomings in giving researchers access to TikTok's publicly accessible data as mandated by Article 40 of the DSA".
The Commission has made its investigation of TikTok a "priority" but also pointed out that the DSA does not mandate deadlines and "the duration of an in-depth investigation depends on several factors, including the complexity of the case, the extent to which the company concerned cooperates with the Commission and the exercise of the rights of defence."
If a finding of non-compliance is reached, TikTok could face penalties of up to six percent of its global turnover, and/or an "enhanced supervision period" during which the EU oversees the platform's efforts to comply with the DSA.
The Register has been unable to find comment from TikTok on the matter, but has asked the org for its reaction to the EU's actions. We will update this story if we receive a substantive response. ®