Comment The Times is campaigning to brand Facebook a "publisher" under British law. While an understandable reaction to the horrible content shared by users of the world's most popular social networking website, trying to make it subject to publishing laws would open a Pandora's box of trouble.
Yesterday the newspaper published a story claiming that Facebook could face criminal liability for "publishing child pornography". It wheels out a QC who claims that the social network's failure to delete material after it is reported to moderators could break various UK laws.
Fine. Nobody disputes that something needs to be done to better police content on Facebook. Better moderators and a harsher approach to taking down obviously illegal content would be a great start.
Yet The Times' campaign to depict Facebook as a publisher is wrong in law. The question of whether website operators can be held legally liable for things posted by others on their websites was settled just four years ago by the Defamation Act 2013. Among other things, the act granted website operators civil immunity from being sued over what website users post – typically in comments sections.
That law was passed mainly to protect news websites. If someone does write something libellous in the comments section, the website operator must either delete the comment or pass on the contact details the commenter used to register with the site, so they can be pursued by the irritated party. Helpfully, the way the act was written means this covers all websites – including Facebook.
Now, none of that covers criminal liability; all bets are off if people are posting indecent images of children or jihadi training material on your internet forum and you're not stopping them. This is the one point where The Times' otherwise muddled campaign hits the mark.
To make Facebook (and Twitter, and other social networks) liable for users' content would almost certainly lead to the Defamation Act 2013 being substantially amended to remove that important protection. That would have a chilling effect on free speech – ironically, the very effect the act was passed to stop. Moderators, faced with criminal sanction for getting it wrong, would default to being as restrictive as possible through simple self-interest. Given the millions of discussions Facebook users engage in on any and all of life's topics, people would soon start shouting about being censored – and quite rightly so.
Anyone who has actually done any comment moderation will know that certain types abuse "report to moderator" buttons when they're losing an argument, in the hope that selective deletions will leave them looking good. (It does happen here on El Reg, more than you'd think. We manually read each and every comment reported to moderators, and take a look at the context before deciding whether to delete it. Quite often we do; sometimes we don't.)
Instead of targeting Facebook with new laws, as The Times would have it, we should target those who misuse the platform to promote illegal things. The newspaper quite rightly reported videos of a "young child being violently abused" to the police – but appears to have tried to turn this against Facebook (judging by the line "A Met spokesman did not say whether Facebook would itself be investigated") instead of insisting that the abusers who made and posted the videos be found and prosecuted.
While the British establishment continues to demonise Facebook – not entirely without cause – we should not bow to its demands to legislate against social networks. For sure, harsh pressure must be applied to ensure Facebook's content moderation improves, and quickly. Passing new laws specifically targeting Facebook, or making it liable for the actions of others, is not the answer. ®
Gareth Corfield was a Reg sub-editor, taking personal liability for all of our stories during a temporary stint as chief sub-editor, before adding years to his life by changing job and becoming a reporter. Behind the scenes, he specialises in media law.