Facebook has admitted it was "far too slow" to recognize that its systems were being used to "spread misinformation and corrode democracy."
In a blog post today by its civic engagement manager Samidh Chakrabarti, the social media giant appears to have become self-aware, following a year in which countless researchers, journalists and lawmakers tried to get it to wake from its reverie and recognize how the company's vast publishing platform is being constantly misused – and that it was in part responsible.
Clickbait fake news, Russian disinformation designed to divide America that reached more than 120 million people, and similar crap have plagued Facebook for years. Now some humble pie has been purchased, microwaved, and nibbled on.
"While I'm an optimist at heart, I'm not blind to the damage that the internet can do to even a well-functioning democracy," Chakrabarti wrote in a missive that reminded us of the comedy sketch by David Mitchell and Robert Webb in which two members of the Nazi SS discuss the uncomfortable fact that their caps are emblazoned with skulls.
"Have you noticed our caps have actually got little pictures of skulls on them?" asks Erich. Hans is unsure: "Er... I don't, erm... " Erich: "Hans... are we the baddies?"
Following a post earlier this month from Facebook CEO Mark Zuckerberg in which he acknowledged that the Silicon Valley titan needed to spend more time "protecting our community from abuse and hate, defending against interference by nation states, or making sure that time spent on Facebook is time well spent," Facebook underlings have seemingly been granted permission to question their own mythical view of themselves.
Chakrabarti does a decent job admitting fault. Facebook is "being used in unforeseen ways with societal repercussions that were never anticipated," he admits, accepting that "we at Facebook were far too slow to recognize how bad actors were abusing our platform."
But this being Silicon Valley, the pessimism isn't allowed to linger for more than a sentence before he declares that the biz is "working diligently to neutralize these risks now." And, as the post progresses, it becomes clear that Facebook is all too ready to slip back into self-delusion and look to everyone but itself for fixes.
"We can't do this alone," warns Chakrabarti, "which is why we want to initiate an open conversation on the hard questions this work raises."
It is a Facebook conversation, of course, which means that Facebook will tell everyone what a great job it is doing, and you are invited to agree with it. There is also no question as to where this will all end up: Facebook being even better than ever.
"In this post, I'll share how we are thinking about confronting the most consequential downsides of social media on democracy, and also discuss how we're working to amplify the positive ways it can strengthen democracy, too," he said.
The post revealed that the company is at least finally being honest about the scale of the problem. "Although we didn't know it at the time, we discovered that these Russian actors created 80,000 posts that reached around 126 million people in the US over a two-year period," the post noted. "This was a new kind of threat that we couldn't easily predict, but we should have done better."
It identified the promotion of "inauthentic Pages" as the key tool. But its solution – requiring organizations that run "election-related ads" to confirm their identities – ignored the fact that it wasn't really ads that caused the problem. And it failed to spell out how broadly "election-related" will be defined.
Chakrabarti also said that election-related ads will be archived and made searchable – a step forward for transparency, though arguably not much of one, and not enough. Facebook was extremely resistant to releasing any information on Russian-sponsored ads and the stats surrounding them, prompting congressfolk to publicly criticize the company.