Google, Apple sued for failing to give Telegram chat app the Parler put-down treatment

Messaging wunderkind said to be haven for extremists – and Silicon Valley gave it a pass, ex-ambassador complains

Marc Ginsberg, a former US ambassador who oversees a non-profit called Coalition for a Safer Web (CSW), sued Alphabet's Google subsidiary on Monday for failing to remove the Telegram Messenger app from its Google Play store.

The complaint [PDF], filed in the US District Court for the Northern District of California, claims that Telegram is rife with hate content yet Google has failed to take action against the app as it did with Parler. Parler saw its Android and iOS apps removed from their respective stores and lost its web hosting earlier this month for failing to moderate insurrectionist content.

On January 17, 2021, CSW filed a similar lawsuit against Apple over its failure to de-platform Telegram Messenger.

The advocacy group has asked Google to take action several times over the past year. In July 2020, the group asked Alphabet CEO Sundar Pichai to suspend Telegram from Google Play due to its extremist content.

"For years, anti-black and anti-Semitic groups have openly utilized Telegram with little or no content moderation by Telegram’s management," the complaint says.

"Despite warnings from CSW and other organizations, extensive media coverage, legal warnings, and other attention that Google is providing an online social media platform and communication service to hate groups, Google has not taken any action against Telegram comparable to the action it has taken against Parler to compel Telegram to improve its content moderation policies."

The Google Play Store has policies against hate speech and other disallowed content, the complaint notes, and it alleges that Google isn't applying those policies evenhandedly, in violation of California's Unfair Competition Law.

The court filing goes on to document multiple examples of hate speech and extremist content that continue to circulate on Telegram. And it claims hate groups "relied on Telegram as an essential tool to facilitate and carry out their terrorist activity, including the United States Capitol attack on January 6, 2021."

Ginsberg made similar points in his testimony [PDF] before the House Energy & Commerce Subcommittee on Consumer Protection & Commerce in September 2020, in which he summarized the ongoing failure of social media companies to deal with extremist content.

In his testimony, Ginsberg challenged the Section 230 immunity that protects internet service providers from liability for user-generated content, asserting "that social media companies have become de facto publishers by taking the editor’s road they have embarked upon to subjectively decide all manner of content visibility or invisibility." His argument is that companies cannot selectively exercise editorial control while also enjoying the immunity from liability granted for not making editorial decisions.

The lawsuit also claims emotional distress, based on the fact that Ginsberg is Jewish and has legitimate fears about being targeted due to his ethnic and religious identity – he is said to have been the target of two assassination attempts.

"By continuing to host Telegram on the Google Play Store, [Google] facilitates religious threats against him and his family that has caused Ambassador Ginsberg to fear for his life," the complaint says.

Neither Google nor Telegram immediately responded to requests for comment.

Fool's errand?

In a phone interview with The Register, Eric Goldman, professor of law at Santa Clara University School of Law, said he doubts the court will even reach the Section 230 aspect of this case, because it turns on a tort claim (emotional distress) and two claims under California competition statutes. He expressed doubt that the case will get very far.

"I don't think the case reaches the merits of those laws," he said. "I don't think they properly claimed the violations the law recognizes."

But if the case goes forward, Goldman said he expects Section 230 of the Communications Decency Act and the First Amendment of the US Constitution would offer Google substantial protection.

"I think we should be concerned that this plaintiff is asking for the unilateral right to veto all Telegram conversations," he said, noting that the collateral damage of such an outcome would be enormous.

"The lawsuit shows how app stores can never please everybody," Goldman explained. "People have been upset about Parler's removal and now a plaintiff is upset because an app hasn't been removed."

The fundamental problem, he said, is that you can't moderate content in a way that keeps everyone happy. Goldman said it would be better if we stopped looking for the optimal outcome and started looking for the least worst option.

"The least worst approach is to let app stores make decisions they think are right for their communities," he said, arguing that's better than forcing app stores to carry apps they think are terrible.

At the same time, lawsuits of this sort, even if they fail in court, can still influence how companies approach content moderation. Maybe that's the point. ®
