One whole day: That's how long Facebook's COVID-19 content moderation went without a mess
AI flagged good content as bad content – and it took the founder of Godwin's Law to point it out
One whole day after telling the world it was going to do its very best to ensure that only high-quality COVID-19 content from proper sources would spread on Facebook, The Social Network has mistakenly identified just such content as violating its community standards.
This one seemingly started with Mike Godwin, a US-based lawyer and activist who coined Godwin’s Law: “As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.”
Godwin was footling on Facebook and tried to share a Times of Israel story titled "Updated every minute, 17-year-old whiz kid's coronavirus site used by millions".
The story has been very widely read, as it profiles a site offering a minute-by-minute account of the global COVID-19 pandemic.
But when Godwin tried to share it on Facebook, here’s what happened.
"Facebook decided that my posting of this Times of Israel article is spam. (It's not spam.) pic.twitter.com/3NqUbiwmyi" – Mike Godwin (@sfmnemonic), March 17, 2020
Other Facebook users have reported similar issues.
Enter Alex Stamos, formerly Facebook’s chief security officer and now an infowar researcher at Stanford.
"It looks like an anti-spam rule at FB is going haywire. Facebook sent home content moderators yesterday, who generally can't WFH due to privacy commitments the company has made. We might be seeing the start of the ML going nuts with less human oversight. https://t.co/XCSz405wtR" – Alex Stamos (@alexstamos), March 17, 2020
Next up was Guy Rosen, Facebook's VP of integrity, who tweeted: "We're on this - this is a bug in an anti-spam system, unrelated to any changes in our content moderator workforce. We're in the process of fixing and bringing all these posts back. More soon."
That prediction was accurate: less than two hours after the above exchanges, Rosen was back with the following.
"We've restored all the posts that were incorrectly removed, which included posts on all topics - not just those related to COVID-19. This was an issue with an automated system that removes links to abusive websites, but incorrectly removed a lot of other posts too." – Guy Rosen (@guyro), March 18, 2020
This one could spawn a thousand utterly unoriginal and/or bleeding obvious LinkedIn posts, because Facebook had just sent its moderation workforce home and increased its reliance on automated filtering. And, as Rosen admitted, the bots flubbed the job. ®