Big Tech has told the UK government it will do more to remove extremist content from its platforms, but has refused to offer concessions on encryption.
Following a meeting between Britain's Home Secretary Amber Rudd and communication service providers, called in the aftermath of the murders in Westminster, senior executives from Facebook, Google, Microsoft and Twitter put out a joint statement.
"Our companies are committed to making our platforms a hostile space for those who seek to do harm and we have been working on this issue for several years," the statement reads, adding: "We share the Government's commitment to ensuring terrorists do not have a voice online."
In order to do that, the companies said they would "look at all options for structuring a forum to accelerate and strengthen this work."
The letter outlines three ways to do that:
- Improve automatic tools to remove extremist content.
- Help other companies to do the same.
- Support efforts from "civil society organizations" to "promote alternative and counter-narratives."
The statement is more notable for its omissions than its promises, however. There is no mention of timelines either on taking down such content, or on taking action. There is no promise to remove such content. There is no offer of firm resources. And the only actual project referred to is the "innovative video hash sharing database that is currently operational in a small number of companies."
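For readers unfamiliar with how such a hash-sharing database works, the idea is roughly this: each participating company computes a fingerprint of a known extremist video and contributes it to a shared set, against which the others check new uploads. A minimal sketch follows; it is not the companies' actual system, and it uses a plain cryptographic hash where real deployments use perceptual hashes that survive re-encoding and cropping.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Compute a fingerprint of a file's bytes.
    (A plain SHA-256 only matches byte-identical copies; production
    systems use perceptual hashing to catch re-encoded variants.)"""
    return hashlib.sha256(data).hexdigest()

# Hypothetical shared database of fingerprints of known extremist
# videos, contributed by the participating companies.
shared_hashes = {fingerprint(b"known-bad-video-bytes")}

def should_block(upload: bytes) -> bool:
    """Flag an upload whose fingerprint matches the shared set."""
    return fingerprint(upload) in shared_hashes

print(should_block(b"known-bad-video-bytes"))  # True
print(should_block(b"some-unrelated-video"))   # False
```

The appeal of the scheme is that companies can share fingerprints of banned material without sharing the material itself.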
Crucially, there is no mention at all of the other pressing issue – encryption.
Two days after the attack, Amber Rudd made headlines by arguing that the authorities must have access to the communications of the attacker – Khalid Masood/Adrian Ajao – specifically highlighting WhatsApp, the Facebook-owned chat app she said Masood had used on the day of the attack.
The Home Office put out its own short statement following the meeting in which it also glossed over the encryption issue, noting that the meeting "focused on the issue of access to terrorist propaganda online."
Rudd said she "welcomes the commitment from the key players to set up a cross-industry forum," but pointedly noted that she would "like to see the industry go further and faster in not only removing online terrorist content but stopping it going up in the first place."
Another recent critic of tech companies on this topic, chair of the Home Affairs Select Committee Yvette Cooper, called the outcome "a bit lame."
"All the Government and social media companies appear to have agreed is to discuss options for a possible forum in order to have more discussions," Cooper complained. "Having meetings about meetings just isn't good enough."
Social media companies in particular are under fire in Europe over the ready availability of extremist material and the apparent ease with which extremists communicate among themselves and with others on systems run by large Western corporations.
The issue is complicated by the fact that most of those corporations are based in the United States, where a strong free-speech culture rooted in the First Amendment leads them to regard removing or even blocking content as tantamount to censorship.
Europe takes a different approach to what constitutes free speech and has threatened to introduce legislation obliging social media companies to remove extremist content or face large fines and lawsuits. ®