Apple wants to scan iCloud to protect kids, can't even keep them safe in its own App Store – report

Tech Transparency Project accuses iGiant of lip service to child safety

Apple, having recently invoked the "think of the children" defense against rivals seeking to open competing iOS App Stores, has been accused of not thinking of the children.

In a report released on Wednesday, the Tech Transparency Project contends that Apple "is failing to take even the most basic steps to protect children" in the App Store. Failures in age verification, it says, exposed children to pornography, gambling, and a host of other supposedly age-restricted apps.

"Using an Apple ID for a simulated 14-year-old, TTP examined nearly 80 apps in the App Store that are limited to people 17 and older – and found that the underage user could easily evade age restrictions in the vast majority of cases," the TTP report said.

For example, the TTP cited a dating app that presents pornographic images before asking the user's age – something that presumably should have been caught in the app review process. It also pointed to adult chat apps with explicit images that include no age check and a gambling app that allows minors to deposit and withdraw funds.

The group argues that Apple's approach to child protection is fundamentally flawed because "Apple and many apps essentially pass the buck to each other when it comes to blocking underage users" and because Apple tolerates age verification methods designed to avoid learning that the user is underage.

The Tech Transparency Project, launched in March 2020 as an expansion of the Campaign for Accountability's Oracle-backed 2016 Google Transparency Project, describes itself as a public interest concern for "exploring the influence of the major technology platforms on politics, policy, and our lives."

The story weakens

Evidence of Apple's inaction despite being aware of dangerous developer practices was presented in the recent Epic Games v. Apple trial. Among the chat conversations entered into the court record is one in which Eric Friedman, head of Apple's FEAR (Fraud Engineering, Algorithms and Risk) team, said:

"[W]e know that developers on our platform are running social media integrations that are inherently unsafe. We can do things in our ecosystem to help with that. For example 'ask to chat' is a feature we could require developers to adopt and use for U13 [under age 13] accounts."

Apple has come under fire previously for exaggerating its app safety claims. In 2019, the Washington Post reported on the abundance of complaints over inappropriate content in chat apps and questioned Apple's assertion that the App Store is a "safe and trusted" place. TTP said its recent findings suggest little has been done since then to make the App Store more kid-friendly.

Apple recently launched a "child safety" initiative that will alert parents when kids send or receive explicit images using its Messages chat app. The effort will also use customers' own iDevices to scan iCloud-bound images in order to report anyone foolish enough to sync a sufficient amount of child sexual abuse material via Apple's servers – which turn out not to be a particularly safe place to store things for those prone to trusting email senders claiming to represent Apple support.

The iBiz's CSAM scanning plan provoked a fierce response from advocacy organizations that characterized the project as a "backdoor" in an open letter asking Apple to reconsider. Another such missive went to Apple CEO Tim Cook.

The Register asked Apple to comment but the company did not reply, perhaps out of concern for its privacy. ®