
Children should have separate sections on social media sites, says UK coroner

Also recommends age verification as Meta tells The Reg it's working on parental controls

UK coroner Andrew Walker has sent a report to Meta, Pinterest, Twitter, Snapchat, and the government itself recommending that adults and children each have their own separate parts of the platforms to prevent harm to youngsters.

The coroner's report [PDF] comes after the death of 14-year-old Molly Russell in 2017, with Walker concluding that the social media content she saw – "concerned with self-harm [and] suicide" – had "likely contributed" to the child taking her own life.

He added that "the platforms' algorithms" sometimes resulted in "binge periods of images, video clips" that hadn't been asked for and that she had requested help from celebrities, "not realizing there was little prospect of a reply."

Walker also asked that the platforms control content so that it is age-specific, that children's accounts be "linked to the parent, guardian or carer's account for monitoring," and that age verification be put in place when users sign up to the platforms.

Twitter requires all users to be at least 13 years old to hold an account, as does Meta, which also offers a Messenger Kids app that operates separately from the main Facebook product. Pinterest and Snapchat likewise require users to be at least 13, though obviously this is not enforced. Existing parental safety apps mostly allow adults to block access to social media sites entirely.

Pinterest said it was "committed to making ongoing improvements to help ensure that the platform is safe for everyone" and said "the Coroner's report will be considered with care." Snapchat declined to make a statement.

Meta said: "We're committed to making Instagram a safe and positive experience for everyone, particularly teenagers, and are reviewing the Coroner's report. We agree regulation is needed and we've already been working on many of the recommendations outlined in this report, including new parental supervision tools that let parents see who their teens follow and limit the amount of time they spend on Instagram.

"We also automatically set teens' accounts to private when they join, nudge them towards different content if they've been scrolling on the same topic for some time and have controls designed to limit the types of content teens see. We don't allow content that promotes suicide or self-harm, and we find 98 percent of the content we take action on before it's reported to us."

We have asked Twitter for comment.

Online Safety Bill

The news comes as the UK's Online Safety Bill continues through the House of Commons, legislation that famously makes unpopular calls for age verification – and, even more controversially, is looking once again to weaken encryption.

The bill, first drafted under former prime minister Theresa May in 2019, was revised this year to authorize imprisonment — up to two years — for anyone whose social media message could cause "psychological harm amounting to at least serious distress."

Most of the tech community, as well as rights orgs like the EFF, finds it a blunt instrument that attacks both free speech and the security of the internet, particularly as the amended bill, like the EARN IT Act in the US, proposes much-derided device-side scanning. Critics say it asks tech companies to do the police's job for them, takes away the basic right not to have one's private communications snooped on, and is the latest in a series of attempts by government to break encryption.

Jo Joyce, senior counsel in Taylor Wessing's commercial tech & data team, said earlier this month that legislating for and regulating digital content viewed by children, while respecting free speech for adults, is a challenge.

"The news of the return of the Online Safety Bill to Parliament didn't surprise many commentators. The protection of children and vulnerable people online is perceived as a vote winner for the government and dropping the Bill entirely was unlikely to be an option, despite the concerns of free speech advocates and pressure from tech businesses."

The bill's passage has been paused while the UK considers whether its new prime minister will have a shelf life exceeding that of a lettuce, but comms regulator Ofcom last week put out research it had commissioned showing the way it perhaps hopes things might go – with age verification very much back in play. The research stated that "a third of children have false social media age of 18+."

Age verification on the internet was first proposed in the UK's 2017 Digital Economy Act, then abandoned in 2019, with security and privacy experts concerned about the collation of private data such checks would necessitate. Among the proposals were signing up with one's credit card – a deeply unpopular idea – and allowing certain firms to act as information collators / age verification service providers, creating huge jackpot targets of citizen data.

British computer security expert Ross Anderson, of the University of Cambridge's compsci department, argued last week that proposed legislation around child safety and policing issues, including the Online Safety Bill, amounted to "magical thinking."

He said: "Effective policing, particularly of crimes embedded in wicked social problems, must be locally led and involve multiple stakeholders; the idea of using 'artificial intelligence' to replace police officers, social workers and teachers is just the sort of magical thinking that leads to bad policy."

A poll of tech professionals grilled by BCS, The Chartered Institute for IT, found just 14 percent of techies believed the legislation was "fit for purpose," with most (58 percent) saying it would "have a negative effect on freedom of speech."

And as for part 3 of the amended bill, which asks platforms to "develop or source" accurate technology to detect abuse material, just one quarter (25 percent) of techies felt this was a realistic demand for Ofcom to make; 57 percent said it was not.

Rob Deri, chief exec of BCS, noted earlier this year: "The technology itself has an important part to play in keeping people safe on social media platforms. However, the Bill leans too heavily on tech solutions to prevent undesirable content, which can't be relied upon to do that well enough and could affect freedom of speech and privacy in ways that are unacceptable in a democratic society.

"The legislation should also focus on substantive programmes of digital education and advice so that young people and their parents can confidently navigate the risks of social media throughout their lives."

Tech sector would need to buy in

A particularly controversial aspect of the original bill was its aim to curb what it called "legal but harmful" expression by netizens.

Taylor Wessing's Joyce said of this: "The attempt in the original Bill to regulate a category of harmful (but not illegal) speech for all individuals was highly controversial and practically challenging.

"It seems as if the attempt to regulate harmful but legal speech will be repeated in relation to children, but not adults. We will have to wait for the revised draft to see whether the government can find a formula that respects free speech for adults while regulating digital content viewed by children.

"This is a huge challenge, not only for those drafting the legislation, but for enforcement or regulation in this area. Detailed guidance is going to be required and without significant buy-in and engagement on self-regulation from the tech sector, the impact of the legislation may be very limited."

Putting aside age verification and other interventionist legislation that damages privacy and security on the internet, and considering only the coroner's "separate platforms" suggestion, it could be argued that children and teens would not want to visit a child-only platform and might still attempt to break into the adult versions. What do you think?

We're curious how Reg reader parents direct their children and teenagers in their internet use, and whether they use parental control apps, ban devices, or simply supply their kids with feature phones with no internet access. You can weigh in with your opinion in the comments below. ®
