WhatsApp pulls plug on Taliban helpline, shuts down official-looking accounts
Terrorists' complaint service a bridge too far for encrypted chat biz
For months, Facebook's WhatsApp paid no attention to the way the Taliban used the messaging service to sell surrender to the people of Afghanistan.
The Taliban, a Specially Designated Global Terrorist entity since 2002, reestablished control of Afghanistan with minimal armed resistance, taking the capital, Kabul, on Sunday and setting up a helpline for civilians to report problems, such as looting and violence, during the regime change.
Now that the Taliban are the de facto government of the country, WhatsApp has finally taken action, shutting down the complaint service along with other Taliban communication channels. The previous government has collapsed, and Afghan President Ashraf Ghani reportedly fled in a helicopter packed with cash.
"We're obligated to adhere to US sanctions laws," explained a WhatsApp spokesperson in response to an inquiry from The Register.
"This includes banning accounts that appear to represent themselves as official accounts of the Taliban. We're seeking more information from relevant US authorities given the evolving situation in Afghanistan."
Facebook has maintained a ban on the Taliban for years, and said it was proactively removing content linked to the Taliban as the women-enslaving militant organization seized control of Afghanistan this month. Yet WhatsApp appears to have been widely used by the group. In response to other reports about the situation in Afghanistan, WhatsApp suggested that its use of end-to-end message encryption limits its visibility into what sanctioned entities or individuals are doing on its platform.
However, WhatsApp's significance in Afghanistan in this instance may have more to do with its reach among the Afghan people than with the moderation challenges posed by encrypted messages. Had the Taliban been communicating only among themselves, their surrender sales pitch would not have reached its intended audience.
Meanwhile, TikTok has said it will continue to remove Taliban content while Twitter and YouTube reportedly plan to rely on their existing platform rules to guide content enforcement decisions.
A question of responsibility
In a blog post on Sunday, Preston Byrne, a partner at law firm Anderson Kill, argued that Facebook and the US government are responsible for failing to silence Taliban messaging when doing so could have made a difference.
"WhatsApp is an American product," he wrote. "It can be switched off by its parent, Facebook, Inc, at any time and for any reason. The fact that the Taliban were able to use it at all, quite apart from the fact that they continue to use it to coordinate their activities even now as American citizens’ lives are imperiled by the Taliban advance which is being coordinated on that app, suggests that US military intelligence never bothered to monitor Taliban numbers and never bothered to ask Facebook to ban them."
It's not clear whether earlier intervention by WhatsApp would have changed anything – the obstacles to US success in Afghanistan date back decades and are far larger than any single communications channel.
But concerns about the destabilizing effects of social media go back a long way too. Historically, Facebook and its subsidiaries have been slow to publicly acknowledge the ways in which social media platforms can be used to sway public opinion.
In 2018, WhatsApp took steps to limit message forwarding following criticism that the messaging service helped stoke violence in Myanmar and India. That same year, Facebook banned 20 organizations and individuals from its service after a UN report criticized the company for failing to prevent violent rhetoric on its platform.
In 2016, Facebook CEO Mark Zuckerberg dismissed the suggestion that fake news had influenced that year's US presidential election as "a pretty crazy idea," only to backtrack 10 months later.
Perhaps it's too much to hope that social media platforms will learn to anticipate these problems instead of reacting after the fact or dismissing them as lunacy. ®