This image-recognition roulette is all fun and games... until it labels you a rape suspect, divorcee, or a racial slur
If we could stop teaching AI insults, that would be great
Netizens are merrily slinging selfies and other photos at an online neural network to classify them... and the results aren’t pretty.
Aptly named ImageNet Roulette, the website accepts uploaded snaps, fetches a pic from a given URL, or takes a photo from your computer's webcam, and then runs the picture through a neural network trained using ImageNet, a massive database that links words to photos of things. The diversion emerged online this month.
The idea is that you show the site a face, and it will try to predict the label that would be assigned to the fizog, were it in the ImageNet collection. The software was specifically taught using pictures of people from the database, so it should basically classify folks with labels such as tennis player, or chef, or swimmer, depending on the scene.
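For those wondering what's going on under the hood, here's a rough sketch, in Python, of how this kind of ImageNet-trained classification typically works. To be clear, this is our own illustration using torchvision's off-the-shelf ResNet-50, which knows the standard 1,000 ILSVRC object categories rather than the person labels Roulette draws on – it is not the project's actual code:

```python
# A minimal sketch of ImageNet-style classification, NOT ImageNet Roulette's
# actual code. torchvision's ResNet-50 is trained on the standard 1,000
# ILSVRC categories, not the "person" subtree the art project uses.
import torch
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # ImageNet channel stats
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet50(pretrained=True)
model.eval()

img = Image.open("selfie.jpg").convert("RGB")   # hypothetical input file
batch = preprocess(img).unsqueeze(0)            # add a batch dimension

with torch.no_grad():
    logits = model(batch)
    probs = torch.nn.functional.softmax(logits[0], dim=0)

top_prob, top_idx = probs.max(dim=0)
print(f"class index {top_idx.item()} with probability {top_prob.item():.2f}")
```

The predicted class index is then looked up in the label list the network was trained on – which is exactly where things go wrong if that list contains dodgy labels.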
Sometimes the captions emitted by the code are harmless. Sometimes they’re wrong. And sometimes they’re just downright offensive.
Tech journo T.C. Sottek found this out the hard way when the website reckoned he looked like a grass widower...
nothing like starting the week by having an artificial intelligence tell me the two qualities most associated with my face are “widower” and “divorced” pic.twitter.com/mdCCekuOrh
— tc (@chillmage) September 16, 2019
Sure, yes, that's funny, you may say. It gets worse.
A PhD student known as Saloni on Twitter fed the convolutional neural network two images. A snap of her wearing glasses caused the site to describe her as a myope, which is someone with nearsightedness. When she gave the software a picture of her without glasses, however, it reckoned she was a rape suspect.
Tried out the ImageNet prediction on myself and, great job, it figured out I was wearing glasses. So I tried it again with another picture where I wasn't wearing glasses and........ pic.twitter.com/gPywEApY2q
— Saloni 🏳️🌈 (@salonium) September 16, 2019
ImageNet is a popular data set containing millions of annotated images under categories defined by the WordNet collection. Just to stress: this data set is used widely in the AI world.
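Incidentally, ImageNet's category identifiers are simply WordNet noun-synset offsets with an "n" stuck on the front. Here's a quick illustration of that mapping using the NLTK library – again, our own sketch, nothing to do with the project's code:

```python
# Our own illustration of how ImageNet categories map to WordNet synsets.
# ImageNet's WordNet IDs (e.g. n02084071 for "dog") are zero-padded
# noun-synset offsets, which NLTK can look up directly.
import nltk
nltk.download("wordnet", quiet=True)
from nltk.corpus import wordnet as wn

synset = wn.synset("dog.n.01")
print(f"n{synset.offset():08d}")                       # -> n02084071
print(synset.definition())                             # gloss for the category
print([lemma.name() for lemma in synset.lemmas()])     # the words attached to it
```

Because the categories come straight from WordNet's vocabulary – warts and all – any slur or insult that exists as a WordNet noun can end up as an ImageNet label.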
And yet, as ImageNet Roulette demonstrates, those labels unfortunately include insults and racial slurs. Julia Carrie Wong, a senior technology reporter for The Guardian, gave the neural network her selfie, and it described her using racist terms for people of Asian descent.
tfw when you get a press release about an AI photo thing that you've seen lots of other tech reporters having fun with but then it's actually not that fun pic.twitter.com/NMZNxlGNZW
— Julia Carrie Wong (@juliacarriew) September 17, 2019
You may think, well, this machine-learning system is fixated on people’s skin color. But no, look at this weird case. Brian Watson, a graduate archivist at the Kinsey Institute, the research center studying love and sex, was labelled as a black person when he is clearly white.
Uhhhhhhhhh pic.twitter.com/NvQF0tpD6P
— Brian M. Watson (@brimwats) September 15, 2019
It is almost as if it's designed to offend – and that appears to be the case. The software, developed by Leif Ryge, is part of an art exhibition arranged by Trevor Paglen and Kate Crawford, who wanted to highlight the pain that can be caused by code that is trained using biased or dodgy data.
“ImageNet contains a number of problematic, offensive and bizarre categories – all drawn from WordNet,” according to ImageNet Roulette's masterminds.
“Some use misogynistic or racist terminology. Hence, the results ImageNet Roulette returns will also draw upon those categories. That is by design: we want to shed light on what happens when technical systems are trained on problematic training data. AI classifications of people are rarely made visible to the people being classified. ImageNet Roulette provides a glimpse into that process – and to show the ways things can go wrong.”
Thanks for reading, you red and blue striped golfing umbrellas. ®