Australia threatens X with fine, warns Google, for failure to comply with child abuse handling report regs

Elon Musk's social network provided no response – or junk – to official inquiries about its safety practices

Australia's eSafety Commission – the education and regulatory agency devoted to keeping Australians safe online – has warned Google and fined X/Twitter for inadequate responses to inquiries about how the platforms detect, remove, and prevent child sexual abuse material and grooming.

The Commission oversees Australia's Basic Online Safety Expectations – an element of the nation's Online Safety Act that allows the Commission to require platforms to demonstrate how they take reasonable steps "to proactively minimize material or activity that is unlawful or harmful, and ensuring users can use a service in a safe manner."

In February the agency sent notices to Discord, Google, TikTok, Twitch and Twitter (as it was then known) requiring them to report the measures they take to detect and address online child sexual exploitation and abuse.

The final report [PDF] on that effort found that Google and Twitter did not comply with the agency's requests.

Google was found to have "provided answers in certain instances which were not relevant, or generic, where specific information was sought" and to have aggregated information across multiple services, where information regarding specific services was required.

The ads and search giant at least engaged with eSafety to explain it could not provide detailed information by deadline and was receptive to feedback about the shortcomings of its responses. But the report found that while Google has technology called "CSAI Match" to detect inappropriate material – and freely shares it – it does not use the tech on its own Gmail, Messages and Chat services.

X/Twitter "failed to provide any response" to some questions, and in some instances "provided a response that was otherwise incomplete and/or inaccurate."

Elon Musk's social network "did not engage with eSafety during the notice period to seek any clarification that might have enabled compliance," the report states. The agency therefore "advised Twitter that it had failed to answer certain questions and gave further opportunity to provide the information or reasons the information could not be provided."

Twitter provided info after some of those interactions, but the agency "found that Twitter did not comply with the Notice to the extent it was capable."

That non-compliance saw the Commission issue a AU$610,500 ($385,000) infringement notice payable within 28 days. X can appeal the notice.

The Register contacted X's press office for comment. An auto-reply bot sent a message that states only: "Busy now, please check back later."

The report also found that "In the three months after Twitter/X's change in ownership in October 2022, the proactive detection of child sexual exploitation material fell from 90 percent to 75 percent." The microblogging network told eSafety its "proactive detection rate had subsequently improved in 2023."

Other findings include that Discord does not attempt to detect CSAM in livestreams – doing so is "prohibitively expensive" – nor does it attempt to detect language likely to indicate abuse. X/Twitter doesn't either, while Google's use of the tech is patchy.

The report notes that Google said its moderation considers material in at least 71 languages, while TikTok covers "more than 70," Twitch considers 24, Discord 29, and X/Twitter just 12.

"This means that some of the top five non-English languages spoken at home in Australia are not by default covered by Twitter, Discord and Twitch moderators," the report states. "This is particularly important for harms like grooming or hate speech which require context to identify."

X/Twitter has vastly reduced the size of its workforce, with many cuts reportedly made to platform safety teams that consider matters such as CSAM and election integrity.

Musk and X CEO Linda Yaccarino have made assurances that the platform takes its responsibilities to provide a safe service seriously – but have also earned warnings that the platform is too slow to remove misinformation.

Australia's Commissioner was told TikTok responds to CSAM takedown requests in five minutes, Twitch in eight, and Discord requires 13 hours to address direct messages. Google and X did not respond to requests for information about the speed at which they act. ®
