How it works
What we do know is that this particular fake video was removed manually by YouTube, and that it found its way to the top of the trending list through an automated algorithm. Beyond that, despite repeated requests, YouTube has refused to give any insight into its systems' failure, even going so far as to obfuscate the issue.
Asked how videos appear on the trending tab, the company said it looks at "several factors" which "include" view count, rate of growth of views, and age of the video. That would point to an organized effort by a large number of fake user accounts to click on this video, and on similar videos promoting the same and related conspiracies.
Just last week, YouTube CEO Susan Wojcicki mocked competitor Facebook and its plans to encourage users to post more video content. On stage at a media conference, Wojcicki said that the social media giant "should get back to baby pictures and sharing."
Despite YouTube's refusal to give any information about how it intends to combat such abuse now and in the future, there are indications that the company is willing to take a stronger approach than in the past.
One YouTuber with more than 150,000 subscribers, David Seaman, posted a video Wednesday complaining that the company has removed all ads from his 500+ videos after he started posting about David Hogg on his channel.
While YouTube appears unable to prevent its systems from being gamed, it is clearly trying to remove the financial incentive for some users to weigh in on controversial topics or to post offensive content.
Last month, the company took similar action against one of its most-followed users, Logan Paul, who boasts 16 million followers. Paul was removed from the company's "Google Preferred" program, and plans to pay him to create original content were put on hold, after an outcry over a video in which he mocked suicide victims while standing next to the body of a recently deceased person.
When Paul subsequently posted another offensive video, YouTube responded by "demonetizing" his videos – that is, removing the ads and so preventing him from receiving any money from them.
This new policy was explained in a blog post by the company's VP of product management, Ariel Bardin, who complained about "the egregious actions of a handful of YouTubers" and said that in such cases in future the company would "suspend a channel's ability to serve ads" and "may remove a channel's eligibility to be recommended on YouTube, such as appearing on our home page, trending tab or watch next."
He went on to note that "in the past, we felt our responses to some of these situations were slow and didn't always address our broader community’s concerns."
Despite its determined effort to apply controls only after the fact, and to keep human intervention to an absolute minimum, YouTube will have to reflect on the fact that its service was used to falsely smear a teenager recently traumatized by a mass shooting at his school – and that its response was to promote that smear campaign in its most high-profile slot.
At some point, YouTube has to stop hiding behind its automated algorithm and accept some degree of responsibility for what appears on its platform, and for the role it plays in amplifying its worst examples. ®