YouTube loves recommending conservative vids regardless of your beliefs

There goes Woke Big Tech again, downplaying traditional liberal views

YouTube's recommendation algorithm not only gently traps viewers in mild echo chambers, it is also more likely to suggest conservative-leaning videos than liberal ones, regardless of your political alignment.

That's according to a study out of New York University's Center for Social Media and Politics (CSMP) that was highlighted this month by Brookings.

Social networks live and die by their recommendation algorithms: they're designed to keep visitors on the sites and apps by feeding them content that keeps them hooked, and if that drives up engagement – sharing links with others, commenting, subscribing, upvoting, etc – all the better. Controversial or mind-blowing material is perfect for this, as it's likely to spark more shares and comments, and thus keep more people addicted.

These feeds are thus personalized to each netizen based on their interests; that's essential to drawing and locking them in. Echo chambers form where people post and share stuff they have a common interest in, and ignore anything contrary to those principles and attitudes. This creates a positive feedback loop of engagement within the group, reinforcing beliefs and increasing time spent – and ads viewed – within the app or website.

Here are some questions. To what level do these recommendation algorithms fuel these echo chambers? Do the algorithms push users down increasingly narrow ideological pathways, radicalizing netizens as they go deeper into an online rabbit hole? Can software turn people to extremism? Is there a systemic bias in the recommendations?

To tackle this subject, a team of researchers at CSMP studied the effects of YouTube recommendations by getting 1,063 adult Americans, recruited via Facebook ads, to install a browser extension that kept tabs on their experience browsing the Google-owned website.

It does push users into increasingly narrow ideological ranges of content in what we might call evidence of a (very) mild ideological echo chamber

The participants were asked to select one starting YouTube video out of 25, made up of a mixture of political and non-political stuff, and then follow a set path through the videos YouTube recommended they watch next.

The netizens were told to always click on the first, second, third, fourth, or fifth recommendation each time; that slot was chosen at random for each person at the start and then kept fixed. This traversal through YouTube's suggested videos was repeated 20 times by each participant, over a period from October to December 2020.

The extension logged which videos YouTube recommended at every stage, and thus the videos that were watched. The team scored the ideological view of each vid, based on whether it was more conservative or liberal-leaning, to measure the effect of echo chambers and any latent bias in the system, and to see if viewers were being recommended increasingly more extreme content.
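For a feel of the mechanics, here's a minimal sketch in Python of what that procedure boils down to – not the researchers' actual code, which isn't included in the write-up. The get_recommendations() and ideology_score() helpers are made-up stand-ins that fabricate data so the walk can run end to end: pick a slot at random, follow it for 20 hops, and log an ideology score for each video along the way.

import random

STEPS = 20   # each participant followed 20 recommendations
SLOTS = 5    # always click the 1st, 2nd, 3rd, 4th, or 5th suggestion

def get_recommendations(video_id):
    # Hypothetical stand-in: pretend to fetch the next five recommended video IDs
    return [f"{video_id}/rec{i}" for i in range(1, SLOTS + 1)]

def ideology_score(video_id):
    # Hypothetical stand-in: made-up score in [-1, 1]; negative = left, positive = right
    return random.Random(video_id).uniform(-1, 1)

def run_participant(start_video):
    slot = random.randrange(SLOTS)   # slot fixed per participant, chosen at random up front
    path = [start_video]
    for _ in range(STEPS):
        path.append(get_recommendations(path[-1])[slot])
    return [ideology_score(v) for v in path]

print([round(s, 2) for s in run_participant("starting_video_01")])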

How that ideological score was determined is kinda crucial to this, so we'll trust for a moment that it was robust. The participants were also surveyed on their demographics.

"We found that YouTube's recommendation algorithm does not lead the vast majority of users down extremist rabbit holes, although it does push users into increasingly narrow ideological ranges of content in what we might call evidence of a (very) mild ideological echo chamber," the academics disclosed in a report for the Brookings Institution.

"We also find that, on average, the YouTube recommendation algorithm pulls users slightly to the right of the political spectrum, which we believe is a novel finding."

The algorithm pulls users slightly to the right of the political spectrum

The abstract of their paper makes clear that this bump to the right happens "regardless of the ideology" of YouTube viewers.

The study found that users were nudged into watching more right or left-wing media, depending on their starting point. YouTube recommendations would thus appear to diverge slightly to the right for conservatives and to the left for progressives. This shift in the ideological strength of the recommendations started small and grew the longer the viewer followed the algorithm's suggestions. In other words, if you watched, say, moderately left-leaning material, over time your recommendations would drift further left, though only very slightly and gradually, according to this study.
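As a back-of-the-envelope illustration – not the paper's actual analysis – that sort of drift could be measured by averaging the ideology score at each hop of the walk, grouped by where participants started, and checking which way the average moves. The numbers below are fabricated purely to show the shape of the calculation.

from statistics import mean

def mean_score_per_step(paths):
    # paths: one equal-length list of ideology scores per participant
    return [mean(step) for step in zip(*paths)]

# Fabricated five-hop score paths: two left-leaning starters, two right-leaning starters
left_starters  = [[-0.30, -0.32, -0.33, -0.35, -0.36],
                  [-0.25, -0.26, -0.28, -0.29, -0.31]]
right_starters = [[0.30, 0.33, 0.35, 0.38, 0.40],
                  [0.28, 0.30, 0.33, 0.35, 0.38]]

for label, group in (("left starters", left_starters), ("right starters", right_starters)):
    trend = mean_score_per_step(group)
    print(label, [round(s, 2) for s in trend], "drift:", round(trend[-1] - trend[0], 2))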

Mild ideological echo chambers thus exist on YouTube, the researchers argued. That would make sense for YouTube, as it ensures viewers stay engaged and remain glued to the site. Notably, there seems to be little to no evidence that viewers are put on a fast track to more ideologically extreme content.

That may be at odds with your own experience, as sometimes you can see how people are pulled down into rabbit holes. Someone starts off with a clip of a harmless talk show host; next they are steered to watch a bit by a moderate comedian; then they find themselves watching a conversation with an alternative podcaster; then they're getting life advice from a questionable academic, then fringe principles from another influencer, and before long they're watching someone skirting YouTube's rules on hate speech. Or maybe the study's right, and it isn't the recommendation engine driving people down steep slopes.

What is more interesting, perhaps, is that YouTube seems to overall lean toward recommending moderately conservative content to users regardless of their political orientation, at least according to the NYU center.

The team isn't quite sure why. "Although our data allow us to isolate the role played by the recommendation algorithm, we are unable to peer inside the black box. Without this clarity, we can't determine whether the algorithm operates more forcefully for conservatives because they are more demanding of ideologically congruent content than liberals, or for some other reason," they wrote in the paper.

It may be that right-wingers are more likely to consistently watch right-wing content and left-wingers are more likely to view more diverse content – stuff they agree with and things they rage-share – or that there are simply more conservative videos than liberal ones on YouTube. YouTube recommendations may be pulled to the right to reflect the video library, satisfy audience demand, and drive up that ever-desired engagement.

A similar effect has also been observed on Twitter. Its recommendation algorithm tends to boost posts from right-wing politicians and news publications more than left-wing ones. Political scientists from the same team at NYU previously commented that it may be because conservative content is more likely to generate more outrage and, therefore, lead to more engagement.

We've also, by the way, noted before that it's a conservative talking point that right-wing views are routinely unfairly censored or hidden away on the internet by Big Tech. That may be because conservative political communication gets seen and flagged as misinformation more often than the opposition's messaging.

No one at YouTube or Google was able to respond to the CSMP study's findings. ®
