Rude web trolls should NOT be jailed, warns prosecution chief

But thugs, stalkers will be hauled before the beak


Web trolls should not be hauled before the courts if their malicious tweets and Facebook updates are quickly deleted, the Director of Public Prosecutions advised today.

Not having many friends online and sticking to offensive banter should also keep Brits out of the dock.

Keir Starmer QC this morning published his interim guidelines for Crown prosecutors, demanding a more measured approach to tackling trolling on the internet. He warned last month that the courts could be choked by millions of cases if every single person who said something annoying online was charged under section 127 of the Communications Act.

The number of trolling prosecutions is on the rise, and in October a 20-year-old bloke was jailed for 12 weeks for sharing off-colour jokes about a missing five-year-old girl.

"These interim guidelines are intended to strike the right balance between freedom of expression and the need to uphold the criminal law," Starmer said this morning.

He added that people firing off messages that are "credible threats of violence, a targeted campaign of harassment against an individual or which breach court orders" should be prosecuted, although web posts that are deemed "grossly offensive" should not be pursued.

He said:

A prosecution is unlikely to be in the public interest if the communication is swiftly removed, blocked, not intended for a wide audience or not obviously beyond what could conceivably be tolerable or acceptable in a diverse society which upholds and respects freedom of expression.

The interim guidelines thus protect the individual from threats or targeted harassment while protecting the expression of unpopular or unfashionable opinion about serious or trivial matters, or banter or humour, even if distasteful to some and painful to those subjected to it.

Under the new guidelines, prosecutors must appreciate the difference between a threat and a "grossly offensive" message before trying individuals in the courts. They would need to carry out a "high threshold" test to determine whether a case should be heard, Starmer said.

"They should only proceed with cases involving such an offence where they are satisfied that the communication in question is more than offensive, shocking or disturbing; or satirical, iconoclastic or rude comment; or the expression of unpopular or unfashionable opinion about serious or trivial matters, or banter or humour, even if distasteful to some or painful to those subjected to it," he added.

The Association of Chief Police Officers (ACPO) welcomed the guidelines, with comms lead Chief Constable Andy Trotter describing Starmer's advice as "a common sense approach" that will help "support consistency from prosecutors and police".

The director's guidance - which does not change the current law - will now be subjected to public scrutiny for three months. ®
