Does UK's Online Safety Act cover misinformation? Well, that depends

Minister, platform providers disagree on whether law would have helped avoid last summer's riots

MPs heard a range of interpretations of UK law when it comes to the spread of misinformation online, a critical factor in the riots across England and Northern Ireland sparked by inaccurate social media posts about the fatal stabbings at a children's dance class on 29 July last year.

Southport, Merseyside, UK, July 30th 2024: A protestor stands near a burning barricade facing riot police, brick in hand – Editorial credit: Ian Hamlett / Shutterstock.com

Baroness Jones of Whitchurch, parliamentary under-secretary of state at the Department for Science, Innovation and Technology (DSIT) and the Department for Business and Trade, said misinformation and disinformation were covered by the Online Safety Act, whose illegal content duties came into force in the UK on March 17, 2025.

The minister was questioned during an April 29 hearing of the House of Commons Science, Innovation and Technology Committee, which wanted to know whether the newly implemented legislation would have prevented the spread of violence in the UK last summer.

Riots broke out after three children were killed at a dance class in Southport, northwest England, in July. They followed false claims that the perpetrator was a Muslim and an asylum seeker, which, along with racist and anti-immigrant hatred, were spread on social media. Rioters targeted mosques, accommodation for asylum seekers, and stores they believed to be Muslim-owned as far-right groups capitalized on racially motivated thuggery. Violence and looting went on for days.

Speaking about the riots, MP and committee member Steve Race pointed out that the tech platforms had said in hearings that even if the Online Safety Act had been fully in force at the time, it wouldn't have changed their response to events following the Southport attacks.

Committee chair Chi Onwurah, a Labour MP, said there were no duties on communications regulator Ofcom to act with regard to misinformation, even though its codes do talk about misinformation and risks. "That seems to be a key issue," she said.

Mark Bunting, online safety strategy delivery director at Ofcom, said the previous government had decided to remove legal material that might be harmful to adults from the scope of the Act, including forms of misinformation. However, he added that Ofcom was still working on the subject and had announced the membership of its Information Advisory Committee.

He later added that while it was "correct" to say the Act does not cover misinformation, there was "one small caveat, which is that [it] did introduce the new offence of false communications with an intent to cause harm, and where companies have reasonable grounds to infer that there is intent to cause harm."

Onwurah pointed out that intent in this context was very difficult to prove.

Baroness Jones said that if events similar to last summer's riots were to take place again, the illegal harms element of the Act would now apply. "I think that is the material difference. Our interpretation of the Act is misinformation and disinformation [are] covered under the illegal harms code and the children's code," she told the Committee.

Government guidance published along with the Act said: "Mis- and disinformation will be captured by the Online Safety Act where it is illegal or harmful to children. Services will be required to take steps to remove illegal disinformation content if they become aware of it on their services."

Civil servant Talitha Rowland, director for security and online harm at DSIT, told the committee that "one of the challenges of this area is mis- [and] disinformation isn't one thing. It can sometimes be illegal. It can be foreign interference. It can be content that incites hate or violence that's clearly illegal. It can also be below the illegal threshold, but nevertheless be harmful to children: that is captured."

She added that misinformation was also captured by many of the largest services' terms of service, which the Act requires them to enforce consistently. "Saying [that] platforms told you that they wouldn't have necessarily done anything different: that at the moment is them marking their own homework. They will have to account to Ofcom as to whether they are actually doing those things, not be able to make that assessment and judgment for themselves," she said.

Bunting added that there was a lack of case law showing how the Act might be interpreted on the point of misinformation. ®
