The Court of Justice of the European Union (CJEU) on Thursday ruled that Facebook can be ordered to update its filters and remove content that a member state's courts have deemed illegal, not only for Facebook users in the plaintiff's country but worldwide.
The decision follows a complaint by Eva Glawischnig-Piesczek, the leader of Austria's Green Party, over a Facebook post about her that she claimed was defamatory. When Facebook refused to remove the offending post, she took her case to court in Austria and won a preliminary injunction covering the offending post and similar ones. The case went to the Austrian Supreme Court, which asked the CJEU, Europe's highest court, for clarification on the scope of EU law when it comes to takedown orders.
The Austrian Supreme Court asked the CJEU to determine whether it could demand that unlawful content be blocked worldwide, and the CJEU said that's fine so long as it's done "within the framework of the relevant international law."
Dan Jerker Svantesson, a law professor at Bond University in Australia, doesn't find this particularly reassuring. "The problem is, however, that the framework of the relevant international law is like the combination of swiss cheese and blue cheese – it is full of holes and what is there stinks," he observed in a post to LinkedIn.
The decision doesn't create a pre-emptive monitoring requirement for online platforms, though Christian Twigg-Flesner, professor of international commercial law at Warwick University in the UK, speculates that could be an eventual outcome.
"It could mean that [internet] platforms might be required to take steps to monitor their content, particularly content provided by third-party suppliers, for information ruled illegal, eg, under consumer protection legislation," he wrote in a blog post on Thursday.
At the very least, the ruling shrinks the safe harbor under which Facebook and other online content platforms can escape liability by demonstrating that they're responsive to legal demands.
In a series of Twitter posts, Daphne Keller, intermediary liability director at the US-based Stanford Center for Internet & Society, described the outcome as "basically the worst case scenario."
"EU Member State courts can order platforms to use automated filters to block 'identical' or 'equivalent' uploads," Keller said. "Orders can have global effect and suppress legal expression in other countries, unless Austrian courts find some other reason in national law to stay their hand."
The problem is that filtering technology may not work as anticipated and may cause collateral damage to free expression. As advocacy groups have told European lawmakers, filters often fail to understand context. They point to YouTube's deletion of 100,000 videos it identified as terrorist content that were in fact part of the Syrian Archive, an advocacy project focused on documenting human rights abuses in Syria.
"It’s open season for courts to mandate made-up technology without knowing what that technology will do," lamented Keller. "And those mandates can apply to the whole world."
In a paper Keller penned earlier this year after Europe's Advocate General issued an opinion to guide the CJEU decision, she noted that filtering technology performs poorly. "One recent study of 'natural language processing' filters found errors in one out of every four to five takedown decisions, and noted that the errors increased when speakers used slang, sarcasm, or languages that tech company employees didn’t speak," she observed. "Another found that automated detection tools disproportionately mis-identified social media posts in African American English as 'toxic.'"
In an email, a Facebook spokesperson said, "This judgement raises critical questions around freedom of expression and the role that internet companies should play in monitoring, interpreting and removing speech that might be illegal in any particular country."
Facebook's spokesperson said the ruling goes beyond the site's Community Standards and content policing processes.
"It undermines the long-standing principle that one country does not have the right to impose its laws on speech on another country," the company's spokesperson said. "It also opens the door to obligations being imposed on internet companies to proactively monitor content and then interpret if it is 'equivalent' to content that has been found to be illegal."
"In order to get this right national courts will have to set out very clear definitions on what 'identical' and 'equivalent' means in practice. We hope the courts take a proportionate and measured approach, to avoid having a chilling effect on freedom of expression." ®