Can YouTube be held liable for pushing terror vids? Asking for a Supreme Court...

Will Section 230 immunity just be revoked? We can answer that

The US Supreme Court on Tuesday heard arguments in Gonzalez et al. v. Google, a case likely to reshape the internet if it goes against the search ad giant.

Spoiler alert: This looks unlikely, based on the oral arguments, according to several legal experts. But further legal challenges await and the case is far from over.

The justices are trying to decide whether Google can be held liable under the Anti-Terrorism Act for apparently recommending terrorist videos on YouTube or whether Section 230 of the Communications Decency Act, which shields internet companies from liability for third-party content, protects Google.

More specifically, the court is considering whether Section 230(c)(1) "[immunizes] interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limit the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information?"

47 U.S.C. § 230(c)(1) says, "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

The petitioners in the case are the relatives of Nohemi Gonzalez, an American murdered in a November 2015 terrorist attack in Paris. The Islamic State of Iraq and Syria (ISIS) took responsibility for the attack and the family of Gonzalez wants Google to be held responsible for allowing terrorist videos on YouTube – which are not alleged to have directly motivated the attack – and for algorithmic recommendations that surfaced terrorist content.

A district court earlier found that Section 230 shields Google from liability in the case, and a divided US Ninth Circuit Court of Appeals affirmed that decision.

Unsurprisingly, Google and supporters in the tech industry would prefer not to be held accountable for recommendation code that surfaces dangerous or unlawful third-party content.

"Creating liability for platforms that use algorithms to rank and moderate content will ultimately force websites to over-moderate or take a hands-off approach to content," said Adam Kovacevich, CEO of tech industry lobbying group Chamber of Progress and former Google public policy executive, in a statement. "In both cases, that’s a worse online experience for users."

The US Justice Department, in its brief [PDF], argues that the appeals court decision should be vacated so the case can be sent back to the district court. The DoJ wants that court to determine whether YouTube is liable under the Anti-Terrorism Act based on its conduct and communications about removing terrorist content, because the company's internal processes don't fall under Section 230.

The Justice Department argues that Section 230 does not apply when the online service provider contributes somehow to the content at issue.

"[W]hen an online service provider substantially adds or otherwise contributes to a third party’s information—such that the resulting content can fairly be deemed the joint product of the provider and that party—both may be viewed as 'information content providers' with respect to that content, and both may be held accountable even on claims that would treat the platform as the 'publisher or speaker' of that content."

But Google is not alleged to have made substantial contributions to terrorist videos in this case.

Santa Clara University School of Law professor Eric Goldman, who has written extensively about the importance of Section 230, said Google appears likely to prevail based on the discussion in court. He made those remarks in a press conference organized by the Chamber of Progress following the court session, and similar comments in a blog post.

"I did not hear five votes in favor of the plaintiffs’ position," wrote Goldman. "Indeed, the justices didn’t really engage with the plaintiffs’ core arguments much after their initial dismantling, which I take as a sign of their lack of persuasiveness. For that reason, I have a little optimism that Google will win the votes – much more so than yesterday."

Goldman cautioned during the press conference, however, that even if Google does prevail, there may be unintended consequences arising from the wording of the eventual Supreme Court decision if it ends up narrowing the applicability of Section 230.

Jess Miers, legal advocacy counsel at Chamber of Progress, said Section 230 is essential for the modern web and eliminating protections for non-neutral tools and algorithms would functionally eliminate social media and would harm marginalized speakers.

The Justices discussed "neutral" tools, rules, and algorithms. Goldman argues that is a misguided concern.

"Algorithms are never neutral and always discriminate," he wrote. "And tool 'neutrality' elides many questions about who made the tools and their normative agenda. No publisher ever wants to use 'neutral' tools because the mere act of publication is, by definition, not a neutral act."

Tomorrow, the US Supreme Court will hear a related case, Twitter, Inc. v. Taamneh, which also involves Section 230 and the Anti-Terrorism Act. In that case, the relatives of a victim of a 2017 ISIS nightclub attack in Turkey sued Facebook, Google, and Twitter, alleging the companies knowingly aided the terrorist group by neglecting to prevent its videos from being distributed.

Goldman said despite signs the Justices appear inclined to side with Google in the Gonzalez case, Section 230 remains highly imperiled. "Section 230 has become the target for all the pro-censorship impulses across both parties and a huge swath of American voters," he said. ®
