Time to talk solutions
We have collectively been in this same situation numerous times in recent years, and thanks to the internet, the fact that it took place in one of the more remote countries on the planet is almost irrelevant.
It was certainly irrelevant to the shooter, who referenced and clearly followed internet posters on the other side of the world as if they were locals in his town.
The simple fact is that the availability of such content, unchecked, is driving the problem. As is the current default of requiring someone else to locate and complain about specific content before its removal.
All social media companies have, for years, patiently explained that the sheer quantity of user-provided content makes it impossible for them to act in the way people call for. They put in place controls, and when those don't work and the next scandal erupts, the companies apologize and assure everyone that they are working hard on the problem and that their systems are getting better.
But the truth is that they are not getting better. They may be removing more content but that is a self-confirming figure that does nothing to understand how widespread the problem is. If you introduce a new control to find more content, it will find more content, like dipping an ever-larger bucket into a polluted river. But the pollution still flows, and we still have no idea how wide or deep the river is.
Here's a certainty: before he left his car armed with a semi-automatic weapon, the shooter checked that his Facebook livestream was up and broadcasting online. If it hadn't been working – if his battery had run out, or his data had cut out, or the connection to Facebook's servers had broken – he likely would not have left the car, but would have stayed inside tinkering with it until it did work.
If the shooter – and the ones that will come after him – couldn't be sure that his video would ever make its way to the public, that there was a very high likelihood it would be flagged during the upload process and stopped – would he have been sufficiently driven to carry out his actions?
Does the murder of individuals become less appealing when you realize no one will see your actions, or know your name, except for the other prisoners in your high-security wing? All the evidence suggests that yes, it does. As awful as that is to contemplate.
This is, after all, a man who painted the names of others on the weapons he turned on his victims – and then took photos of them and posted them online. Photos and videos that are now being republished all over the internet, including by mainstream news outlets, leaving a trail of material for the next person to obsess over.
If those photos, and his manifesto, and his video had all fallen into a moderator black hole, never to emerge on the public internet, would the desire to inflict untold levels of pain and anguish on people in the real world diminish? Again, the answer is yes, it probably would.
Because the online, virtual world is the one he inhabited, where he fed his obsessions and fears and stoked his anger and hatred. Without recognition in that world, why bother? And if that material only ever existed on his phone, it would not and could not be shared and promoted and amplified.
Does this mean that, with actual proper moderation, there would be a short delay in people being able to see your cat fall off the couch, or your child do a funny dance, when you yourself post your own content on Facebook, YouTube et al? Will you have to wait a few minutes, or perhaps longer, before your 13 livestream viewers are allowed to watch you bake cookies, write code, or simply stare off into the sunset on vacation?
Yes, yes, you will. Sorry for the inconvenience. But on the plus side, we won't have videos of people being murdered in cold blood offered as entertainment on an unprecedented and unjustifiable scale. ®