YouTube's recommendation engine is pretty naff, Mozilla study finds
It even pushes videos that break the site's own content policies
The majority of YouTube videos that netizens taking part in a study said they regretted watching were recommended by the website's space-age AI algorithms.
“This problem with YouTube’s recommendation algorithm is part of a bigger story about the opaque, mysterious influence that commercial algorithms can have on our lives,” the Mozilla-led study, released on Wednesday, concluded.
“YouTube’s algorithm drives an estimated 700 million hours of watch time every single day, and yet the public knows very little about how it works. We have no official ways of studying it.”
Firefox-maker Mozilla built a browser extension named RegretsReporter for YouTube users to download. Once installed, the extension logged netizens’ viewing activity on YouTube, recording details on the videos watched, and made it easy for users to flag clips that they wished they hadn’t seen. The data was pooled together and analyzed in an attempt to study YouTube's recommendation engine's behavior and effectiveness.
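The article doesn't reproduce the extension's internals, but the mechanics are simple enough to sketch. Below is a minimal, hypothetical example of how such an extension might log watch-page visits using the standard WebExtensions API; the ViewRecord type, the "views" storage key, and the overall layout are our assumptions rather than Mozilla's code, and a real extension would also declare the webNavigation and storage permissions in its manifest.

```typescript
// A minimal sketch of a RegretsReporter-style background script, assuming
// the standard WebExtensions API (via webextension-polyfill). The ViewRecord
// shape and the "views" storage key are illustrative assumptions, not
// Mozilla's actual implementation.
import browser from "webextension-polyfill";

interface ViewRecord {
  videoId: string;   // the "v" parameter from the watch URL
  visitedAt: number; // epoch milliseconds
}

// Pull the video ID out of a YouTube watch URL, e.g. .../watch?v=abc123
function videoIdFromUrl(url: string): string | null {
  try {
    const parsed = new URL(url);
    if (
      (parsed.hostname === "www.youtube.com" || parsed.hostname === "youtube.com") &&
      parsed.pathname === "/watch"
    ) {
      return parsed.searchParams.get("v");
    }
  } catch {
    // Malformed URL: nothing to record
  }
  return null;
}

// Log every completed watch-page navigation to extension-local storage;
// a popup UI could later read this list and let the user flag a "regret".
browser.webNavigation.onCompleted.addListener(async (details) => {
  const videoId = videoIdFromUrl(details.url);
  if (videoId === null) return;

  const stored = await browser.storage.local.get("views");
  const views: ViewRecord[] = stored.views ?? [];
  views.push({ videoId, visitedAt: Date.now() });
  await browser.storage.local.set({ views });
});
```

From there, a popup page could read the stored list and submit flagged entries to a pooled data set for analysis, which is broadly the crowd-sourcing model the study describes.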
There are a number of reasons why the volunteers regretted watching vids, ranging from the material being nothing more than bonkers QAnon conspiracy theories about political elites drinking children’s blood, to false information on COVID-19 vaccines, to inappropriate raunchy parodies of smash-hit movie Toy Story.
Mozilla researchers found that 71 per cent of all videos volunteers regretted watching on the Google-owned platform were recommended by the giant's whiz-bang AI algorithm. They also estimated that about 12.2 per cent of those reported videos contained content that violated YouTube’s own guidelines and policies – the vids shouldn't be on the site at all, let alone recommended by it.
A total of 37,380 YouTube viewers across 190 countries volunteered in the crowd-sourced study; 3,362 reports were submitted for videos they regretted watching between July 2020 and May 2021. YouTube’s recommendation algorithm seems to perform worse in non-English-speaking countries, where people logged higher rates of regret reports. Brazil, Germany, and France were rated worst, while the US and the UK were ranked eighth and sixteenth respectively.
“The goal of our recommendation system is to connect viewers with content they love and on any given day, more than 200 million videos are recommended on the homepage alone,” a Google spokesperson told The Register in a statement.
“Over 80 billion pieces of information are used to help inform our systems, including survey responses from viewers on what they want to watch. We constantly work to improve the experience on YouTube and over the past year alone, we’ve launched over 30 different changes to reduce recommendations of harmful content. Thanks to these changes, consumption of borderline content that comes from our recommendations is now significantly below 1 per cent.”
YouTube has grappled with recommendation systems for years and has adjusted them to improve performance. Still, its automated software isn’t perfect: videos that violate its content policies still slip through and continue to be recommended to users. Mozilla reckons that will only continue if YouTube remains coy on how its algorithm actually works.
“We believe what our research has revealed is only the tip of the iceberg, and that each of these findings deserves and requires further scrutiny,” the report stated.
“We also recognize that without intervention to enable greater scrutiny of YouTube’s algorithms, these problems will continue to go unchecked and the consequences on our communities will build. Despite the progress that YouTube claims to have made on these issues, it is still nearly impossible for researchers to verify these claims, nor study YouTube’s recommendation algorithms.”
Mozilla recommended YouTube publish data on how its systems work and be transparent about its content moderation and recommendation model. That way, Moz argued, researchers can independently audit the AI software. ®