YouTube's radicalizing Alt-right trolls and Facebook's recruiting new language boffins

It's all the AI news you might have missed

Roundup Hello, here's a quick roundup of news from the world of machine learning.

Facebook has started an NLP research consortium: Facebook’s AI research group has launched a natural language processing consortium to partner with boffins working on areas like machine translation and sentiment analysis. Members of the consortium will receive funding and collaborate with Facebook researchers on multiyear projects.

“Facebook believes strongly in open science, and we hope the consortium, as well as these research awards in NLP and machine translation, will help accelerate research in the NLP community,” it said this week.

It also announced the research proposals that will receive funding, split into three areas: making NLP models more efficient, machine translation for uncommon languages, and robust deep learning.

YouTube and radicalization: There are numerous case studies of YouTube algorithms contributing to online radicalization by promoting far-right videos and conspiracy theories.

The internet is a cesspit of loathing, for the world and oneself. Content platforms like YouTube only amplify that, according to a team of researchers who have attempted to study the phenomenon.

“Non-profits and the media claim there is a radicalization pipeline on YouTube. Its content creators would sponsor fringe ideas, and its recommender system would steer users towards edgier content. Yet, the supporting evidence for this claim is mostly anecdotal, and there are no proper measurements of the influence of YouTube’s recommender system,” they wrote in the paper’s abstract.

The researchers decided to analyze 331,849 videos across 360 channels associated with the Alt-right, the “Alt-lite” (a gentler version of Alt-right ideology), and the so-called Intellectual Dark Web.

They also looked at more than 79 million comments left on the videos and found that the three online communities increasingly shared the same user base. These viewers tended to start off watching lighter material – say, Alt-lite videos – and eventually landed in more aggressive Alt-right territory.
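One way to quantify that kind of audience overlap is to compare the sets of users commenting in each community, for instance with Jaccard similarity. Below is a minimal sketch of the idea; the community labels mirror the study, but the user IDs and any resulting overlap figures are made up for illustration and are not drawn from the paper's data.

```python
# Minimal sketch: measuring user-base overlap between communities via
# Jaccard similarity over the sets of commenting users. The community
# names match the study; the user IDs are hypothetical placeholders.
from itertools import combinations

commenters = {
    "Alt-lite":              {"u1", "u2", "u3", "u4"},
    "Intellectual Dark Web": {"u2", "u3", "u5"},
    "Alt-right":             {"u3", "u4", "u5", "u6"},
}

def jaccard(a: set, b: set) -> float:
    """Fraction of users active in both communities."""
    return len(a & b) / len(a | b)

for (name_a, users_a), (name_b, users_b) in combinations(commenters.items(), 2):
    print(f"{name_a} vs {name_b}: {jaccard(users_a, users_b):.2f}")
```

Run over yearly snapshots of commenters rather than one static set, the same measure would show whether the communities' audiences are converging over time, which is the trend the researchers report.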

“The three communities studied sky-rocketed in terms of views, likes, videos published and comments, particularly, since 2015, coinciding with the turbulent presidential election of that year,” the researchers concluded.

It wasn’t the recommended videos that were the worst offender, however, but YouTube’s recommended-channels feature. Viewers who started off with videos from channels classified as Alt-lite or Intellectual Dark Web were, over time, led down the path to Alt-right content.
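To get a feel for how channel-to-channel recommendations can shuttle a viewer between communities, here is a toy random-walk simulation over a hypothetical recommendation graph. The channel names, edges, and resulting percentage are all invented for illustration; they are not the paper's graph or results.

```python
# Toy simulation: random walks over a made-up channel-recommendation
# graph, tracking how often a walk starting at an "Alt-lite" channel
# ends at an "Alt-right" one. All channels and edges are hypothetical.
import random

# channel -> list of recommended channels (invented for illustration)
recommendations = {
    "lite_1":  ["lite_2", "idw_1"],
    "lite_2":  ["lite_1", "right_1"],
    "idw_1":   ["lite_2", "right_1"],
    "right_1": ["right_2", "idw_1"],
    "right_2": ["right_1"],
}

def walk(start: str, steps: int) -> str:
    """Follow recommended channels for a fixed number of hops."""
    channel = start
    for _ in range(steps):
        channel = random.choice(recommendations[channel])
    return channel

random.seed(0)
trials = 10_000
hits = sum(walk("lite_1", steps=5).startswith("right") for _ in range(trials))
print(f"Walks from Alt-lite reaching Alt-right after 5 hops: {hits / trials:.1%}")
```

The study itself performs a far more elaborate version of this kind of traversal, following real YouTube recommendations to trace where different starting communities lead.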

If that's not convincing enough, you can read the paper on arXiv here. ®
