Internet giants Google, Facebook and a wide range of organizations from Pinterest to Kickstarter to Wikimedia have responded furiously to a recent decision by the US Ninth Circuit Court of Appeals that could have huge liability implications for online companies.
"This has already created tremendous uncertainty in the online community," an amicus brief [PDF] to the case filed this week reads, arguing that it may "compel some providers to scale back or abandon beneficial efforts to prevent illegal or offensive content from being posted."
The brief asks the Ninth Circuit to rehear the case en banc, meaning that a panel of 11 judges, rather than just three, will go over it a second time.
Referring to the "safe harbor" provision of the Digital Millennium Copyright Act (DMCA) – which means internet companies are not liable for what their users post online using their services so long as they respond to takedown requests from copyright holders – Google, Facebook et al argue: "Never before had such prescreening activities been invoked as a possible basis for categorically evicting a provider from the safe harbor."
And they claim the ruling poses a "considerable threat ... to the stability of the DMCA regime and to the quality of content on the Internet."
As we reported earlier this month, the San Francisco-based court sided with celebrity photographer Mavrix in its case against LiveJournal and effectively ruled that since one of LiveJournal's websites actively screened content, it could be held liable for infringing copyright on obviously protected photographs.
The "Oh no, they didn't!" news site receives user-submitted content but it is only posted after active approval by moderators (most of whom are unpaid) working to set guidelines.
Mavrix says that on numerous occasions its photos – complete with visible watermarks – appeared on the site, and it argued, seemingly successfully, that the site knew the images infringed copyright but was using the "safe harbor" protection under the DMCA to turn a blind eye.
While internet companies rightly credit the safe harbor provision as a critical factor in their success – it freed them from having to constantly track and worry about user-generated content – the reality is that the Ninth Circuit has recognized that the approach may be out of date.
The ease with which billions of internet users can post content online has turned the DMCA takedown approach into something of a joke. Google receives roughly three million such requests every day – and the situation is only getting worse.
Faced with the impossible task of tracking their copyrighted works across the internet, some companies have resorted to automated software that churns out hundreds of thousands of variations of web addresses once a single copy of one piece of work has been located.
Google recently estimated that 99.95 per cent of the requests it receives don't correlate to an actual URL – they are automated efforts to pre-emptively prevent copyright infringement. Google, rightly, views this as abuse of the system; copyright holders don't feel they have any better options; everyone hates it.
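To make that concrete, here is a minimal Python sketch of the kind of brute-force variant generation such a tool might perform – toggling scheme, "www." prefix, and trailing slash on a single discovered URL. The function name and the specific variation rules are illustrative assumptions, not any actual vendor's software; real tools reportedly go much further, which is how one found copy balloons into hundreds of thousands of takedown URLs, most pointing nowhere.

```python
from itertools import product
from urllib.parse import urlparse, urlunparse

def url_variants(url):
    """Naively enumerate variations of a URL, as a pre-emptive
    takedown tool might: swap http/https, toggle the 'www.'
    prefix, and add/remove a trailing slash."""
    parsed = urlparse(url)
    host = parsed.netloc
    # With and without the 'www.' prefix
    hosts = {host, host[4:] if host.startswith("www.") else "www." + host}
    # With and without a trailing slash
    path = parsed.path.rstrip("/")
    paths = {path, path + "/"} if path else {"", "/"}
    variants = set()
    for scheme, h, p in product(("http", "https"), hosts, paths):
        variants.add(urlunparse((scheme, h, p, "", parsed.query, "")))
    return sorted(variants)

# One located copy yields eight candidate takedown URLs here;
# stack on more transformations and the count explodes.
for v in url_variants("https://example.com/photos/celeb.jpg"):
    print(v)
```

Only one of those eight addresses actually exists, which is the pattern behind Google's 99.95 per cent figure.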
The Ninth Circuit's approach – in the case it has remanded to the lower court – is to try to introduce some additional factors into the equation that would force companies to consider whether material on their site is infringing copyright, rather than simply letting users post and dealing with infringement only when contacted by copyright holders.
It asked the lower court to look at whether the LiveJournal site's moderators "acted as agents" of LiveJournal, and to consider:
- Did it have "actual knowledge" that the pictures were infringing copyright?
- Or, did it have "red flag knowledge," meaning that it would have been obvious to a "reasonable person" that the pictures infringed copyright – and this is where the watermark will be a critical component.
This has the potential to cause a huge shift in approach and liability by internet companies – and they are not happy about it.
Aside from warning that the ruling could cause companies to simply abandon any efforts to screen content and open the floodgates to illegal content, Facebook, Google and pals argue that the Ninth Circuit has no business even considering the issue of screening content, since it is not included in the text of the DMCA.
"Whether the service provider screened such content for relevance, legality, or some other criteria prior to its being posted is irrelevant," the amicus brief argues. "Under the plain text of the statute, material submitted for posting by a user is still stored 'at the direction of' that user even when it goes through some review process before becoming accessible to the public."
Legally, the companies have a point here, but common sense dictates this is an unrealistic argument: you can pretend that moderation and content screening have no impact, but they plainly do.
The court noted that many companies already screen things like pornography to provide a better service, implying that the technical ability and systems are in place.
This was also not appreciated by the internet giants, who complain that there was little explanation or definition given and that "without further explanation, the panel kicked the case to the 'fact finder' to determine 'whether the moderators' acts were merely accessibility-enhancing activities or whether instead their extensive, manual, and substantive activities went beyond the automatic and limited manual activities we have approved as accessibility-enhancing'."
This approach "departs from the text of the statute" and "fails to offer any meaningful guidance to courts or service providers about how to ensure that their content-review efforts remain on the right side of the panel's hazy new line," complains the brief.
Irrelevant, your honor
It also argues that the screening of content is not included in the relevant part of the DMCA dealing with safe harbor – another legally solid argument but one that is splitting hairs. Likewise the contention that the Ninth Circuit has no right to distinguish between "submitting" and "posting" content – because those words do not appear in the Act.
It is only on solid commonsense ground when it argues that, with the DMCA, Congress specifically decided it did not want service providers monitoring user uploads for copyright-infringing content, intending instead that screening efforts focus on things like pornography, violence and spam.
Again, however, we return to the fact that times have changed and that the technology may already be there to start clamping down on infringing content.
Besides, any change is not just bad for companies like Google and Facebook, they argue, it could also "degrade the quality of online services," harm users "who want to create and consume high-quality content" and even, apparently, "undermine the interests of copyright owners by thwarting efforts that providers might otherwise make to try to identify and block infringing material that users submit for posting." Yeah, we're not sure what that means, either.
The brief concludes: "Simply put, the panel's decision threatens to bring about an outcome that would make things worse for everyone who is supposed to benefit from the DMCA. This is not remotely what Congress intended or prescribed. And rehearing should be granted to prevent it."
As for the likelihood of the en banc appeal going forward, there are roughly 1,500 requests a year for en banc hearings, 50 of which move forward to votes and 15-25 of which are actually heard.
Given the potential impact of the decision and the size and profile of the applicants in the case, this may be one of the very few that get reheard. ®