Stung by global criticism over murder videos on his sprawling web empire, Facebook CEO Mark Zuckerberg has promised to swell the ranks of his moderator army.
"Over the last few weeks, we've seen people hurting themselves and others on Facebook – either live or in video posted later," Zuckerberg wrote in an update to his personal Facebook page. "It's heartbreaking, and I've been reflecting on how we can do better for our community."
And his solution is to add 3,000 people over the course of 2017 to the 4,500 already on its "community operations team" – a hiring spree that would push Facebook's total headcount to around 20,000. That community team is tasked with reviewing posts, images and vids that are flagged up by Facebook users as potentially awful.
In addition to the extra eyeballs, Zuckerberg said the company was looking at how to make it easier for users to report videos and how to speed up the takedown process. And he said the company would "keep working with local community groups and law enforcement who are in the best position to help someone if they need it – either because they're about to harm themselves, or because they're in danger from someone else."
The decision to act follows a global outcry after the Facebook Live video streaming service – which lets anyone with a smartphone instantly stream to Facebook through its app – was used to film several murders.
In one incident last month, a man in Cleveland in the United States videoed himself driving around the city promising to kill random strangers. He uploaded one video of himself shooting and killing an elderly man. Just a week later, a man in Thailand streamed himself hanging his 11-month-old daughter before killing himself.
These shocking incidents follow a number of live suicide videos and videos depicting violent and abusive behavior as well as terrorist incidents.
Time is of the essence
Although Facebook is clearly not to blame for people's behavior, the company has come under severe criticism from users, the media and politicians for failing to do enough and for acting too slowly. The video of the hanged baby, for example, remained online for more than 24 hours after it was first reported to Facebook.
Zuckerberg fell back on the social media company's stock response to criticism that it is not doing enough to police its service: its sheer popularity makes it difficult.
"We're working to make these videos easier to report so we can take the right action sooner ... we'll be adding 3,000 people to our community operations team ... to review the millions of reports we get every week, and improve the process for doing it quickly," he noted.
That excuse is quickly wearing thin, however, with politicians publicly calling the company out on its enormous profits and asking why some of that money can't be spent on making the service safer.
On top of that, the German government, the European Commission and now the UK government have made formal proposals to fine Facebook, Google, Twitter et al millions of dollars if they do not remove illegal content within a short timeframe.
And just this week, the UK government pushed the issue one step further by arguing that social media companies should be actively searching for such content and removing it – and threatened to have UK police do the job instead and invoice the companies for officers' time if they fail to act.