It's time to reveal all recommendation algorithms – by law if necessary

We should know why we see what we see, not be left in the dark

Column As it’s been about forty years since I’ve had pimples, it astounds me that YouTube’s recommendation engine recently served me videos of people with some really severe skin problems - generally on their noses. The preview images alone are horrifying, and should really come with some sort of content warning. I immediately told YouTube “I don’t want this” and “never recommend this channel.”

Yet it persists.

How? Why?

I’ve never watched a dermatology video in all my years of using YouTube, nor do I - or anyone I know - have a skin condition that might suggest to the algorithm that I need to be exposed to these truly cannot-be-unseen images.

The essential nature of a recommendation algorithm is that it’s doing its best to anticipate your desires from whatever bits of data it can gather about you.

I defend myself against the arbitrary data collection that fuels these algorithms with Pi-hole, Firefox, and the tracker-blocking Disconnect extension, plus a few other tricks. In theory, recommendation algorithms therefore have less to work with than if I simply journaled my every activity via some sort of oh-so-friendly-and-rapacious Android smartphone.

I make an active effort to resist data collection - and perhaps these horrors are one consequence.

At best that’s a guess, because I have no way to know what goes on at the heart of YouTube’s recommendation algorithm. If ever exposed, that closely guarded algorithm could be gamed - in a manner similar to the way so many marketing bottom feeders continually test and game search engine results.

There’s a profound commercial disincentive for YouTube to become transparent about how its algorithm works.

That leaves me and other privacy-conscious folk with just one lever to pull - disliking a video or a channel and hoping that - somehow - the algorithm might intuit a generalized case from a specific instance.

In the one case where we’ve been shown the inner workings of a recommendation engine - Twitter went public with its algorithm at the end of March - we got an eyeful of a different kind of horror: an algorithm that promoted owner Elon Musk’s tweets above all others, promoted specific political interests, and specifically declined to promote tweets directly pointing to LGBTIQ+ words, concerns, or themes.

While we can all have a nice laugh at Elon’s profoundly public narcissism, the silencing of an entire community at a moment in history when forces around the world seek to roll back recent gains in civil rights for LGBTIQ+ individuals is not funny. Where it becomes harder to share one’s own story, that story can be framed as marginal, unimportant - even dangerous. It’s a profound ‘othering’ that can, thanks to an algorithm, be greatly amplified.

The solution to both issues is obvious, technically easy, and yet commercially a nearly impossible proposition: open up all recommendation algorithms.

Make them completely transparent, and, for the individual being targeted by the recommendation engine, completely programmable.

I should not only be able to interrogate why I was served a horrifying video of a very bad case of pimples; I should be able to get in there and tune things so that the algorithm no longer needs to guess my needs, because I’ve had the opportunity to make those needs clear.
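To make that concrete, here’s a minimal sketch of what “programmable” could mean in practice. Everything in it is hypothetical - no platform exposes an interface like this today, and none of these names correspond to a real YouTube or Twitter API - but it shows the shape of the idea: the rules live with the user, readable and editable, rather than inside a black box.

```python
# Hypothetical sketch only - not a real platform API. The point is
# that the rules live with the user, in plain sight.

from dataclasses import dataclass, field

@dataclass
class Video:
    title: str
    channel: str
    topics: set[str]
    score: float  # the platform's opaque relevance guess

@dataclass
class UserPolicy:
    blocked_topics: set[str] = field(default_factory=set)
    blocked_channels: set[str] = field(default_factory=set)
    boosted_topics: dict[str, float] = field(default_factory=dict)

    def apply(self, candidates: list[Video]) -> list[Video]:
        """Filter and re-rank candidates by the user's explicit rules."""
        kept = [
            v for v in candidates
            if v.channel not in self.blocked_channels
            and not (v.topics & self.blocked_topics)
        ]
        # Boost the scores of topics the user has explicitly opted into.
        for v in kept:
            for topic, weight in self.boosted_topics.items():
                if topic in v.topics:
                    v.score *= weight
        return sorted(kept, key=lambda v: v.score, reverse=True)

# "Never recommend this channel" becomes a rule I can read and edit,
# not a vote the algorithm may silently ignore.
policy = UserPolicy(
    blocked_topics={"dermatology"},
    boosted_topics={"synthesizers": 1.5},
)

feed = [
    Video("Severe nose pimples!!", "DrPop", {"dermatology"}, 0.9),
    Video("Juno-60 deep dive", "SynthHistory", {"synthesizers"}, 0.6),
]
for v in policy.apply(feed):
    print(v.title)  # only the synth video survives
```

Whether the knobs end up as code, a settings page, or something in between matters less than the principle: the policy is mine, and I can see exactly what it does.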

Every algorithm that recommends things to us - music or movies or podcasts or stories or news reports - should be completely visible. There must be nothing secret behind the scenes, because we now know from countless examples - the biggest and ugliest being Cambridge Analytica - how recommendations can be used to drive us to extremes of belief, emotion - even action.

That’s too much power to leave with an algorithm, and too much control to cede to those who tend those algorithms.

If recommendation algos aren’t shared, then we need - by legislation, if necessary - a switch that turns the recommendation engine off.

That might leave us floating in a vast and unknowable sea of content, but it’s better to know you’re nowhere than to be led down a garden path. ®
