After London attack, UK gov lays into Facebook, Google for not killing extremist terror pages
PM's spokesman and foreign secretary wave warning finger
In the wake of a terror attack in the heart of London this week that left five dead, the UK government has turned its ire onto online companies – including Google and Facebook – for not doing enough to remove extremist webpages and other content from their services.
The two online giants "can and must do more," said the Prime Minister's spokesman on Friday. "Social media companies have a responsibility when it comes to making sure this material is not disseminated," he added.
Meanwhile UK foreign secretary Boris Johnson, who is currently in New York, noted that extremist publications are easily located online and are "corrupting and polluting" people. He also placed the blame on social networks: "They have got to look at the stuff that is going up on their sites. They have got to take steps to invigilate it and to take it down where they can."
Even the polarizing newspaper the Daily Mail got in on the action, directly blaming Google on its front page for the attack's deadly nature. Under the huge headline "GOOGLE, THE TERRORIST'S FRIEND..." it declared: "Yesterday it took the Mail two minutes on web to find a terror manual on how to use a car for mass murder."
The Daily Mail's front page has, predictably, split the country – but there's no denying the pressure on online companies to do more about extremist content
All this criticism comes in the same month that the German government formally proposed fining such companies €50m if they fail to remove "obvious" criminal content within 24 hours. The European Commission, meanwhile, threatened further fines if the companies did not strip legal escape clauses from their terms and conditions – such as requiring European citizens to sue them in California rather than in their own countries. Those terms and conditions break European law, said an EC spokesman.
And if all that wasn't sufficient, earlier this week, before the attack in Westminster, the chair of an influential parliamentary committee, Yvette Cooper of the Commons home affairs committee, laid into Facebook, Google and Twitter for failing to deal effectively with extremist content, saying they were too slow and questioning whether they were making serious efforts to do anything.
Google has also been hit in the wallet in recent days, after a number of very large advertisers pulled their money when it was revealed that YouTube ads were being run alongside extremist content.
The list of companies seems to be growing by the day – AT&T, PepsiCo, General Motors, Johnson & Johnson, Starbucks, Walmart and Verizon – and even that is not the full list.
Clearly, some kind of tipping point has been reached. Europe continues to suffer attacks in which its citizens are murdered, while US companies insist that those countries must, de facto, follow American free speech laws by not restricting content.
Google, Facebook and Twitter have all been furiously rewriting their policies in an effort to stem the anger. It is hard to know right now whether those attempts – which largely comprise hiding or downgrading the visibility of illegal or offensive content rather than removing it – will prove sufficient to calm clear European anger.
But the fact that the most senior officials in the UK government have made a point of publicly criticizing social media companies just a day after such an attack does not bode well. ®