Twitter: Our AI image-cropping algorithm is biased toward White people, women
And that's why we've let humans take back control
Twitter said its AI-powered image-cropping algorithm is slightly biased in favor of White people and women after all, and has taken steps to ditch its reliance on the machine-learning code.
The algorithm was used to automatically decide which portion of a photo shared on the social network was best for the screen viewing it. You can upload pictures in all sorts of sizes and aspect ratios; at some point the images have to be displayed, and Twitter used a trained system to decide which parts of the media to show. The option to view the whole image was available – the algorithm was used to create a preview of the media.
Specifically, the software relied on a saliency algorithm designed to keep the most interesting and relevant parts of the image in the preview crop. For example, a snap taken of people at the beach should focus on their faces rather than the sky or sand.
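Twitter hasn't published the exact cropping code described here, but the general approach – score every pixel for saliency, then pick the crop window with the highest total score – can be sketched as follows. The function name and the brute-force window search are illustrative assumptions, not Twitter's implementation:

```python
import numpy as np

def crop_by_saliency(saliency, crop_h, crop_w):
    """Slide a crop window over a 2D saliency map and return the
    (top, left) offset of the window with the highest total saliency."""
    H, W = saliency.shape
    best_score, best_pos = -1.0, (0, 0)
    for top in range(H - crop_h + 1):
        for left in range(W - crop_w + 1):
            score = saliency[top:top + crop_h, left:left + crop_w].sum()
            if score > best_score:
                best_score, best_pos = score, (top, left)
    return best_pos

# Toy example: a high-saliency patch (say, a face) near the top-left
# pulls the crop window toward it.
sal = np.zeros((6, 6))
sal[1:3, 1:3] = 1.0
top, left = crop_by_saliency(sal, 3, 3)
# The chosen 3x3 window fully contains the salient patch.
```

Whatever the scoring model, the key point is that the crop follows the saliency map – so any demographic skew in the predicted scores skews which person ends up in the preview.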
Twitter users, however, spotted that the tool, when given a choice of things to pick, seemed to zero in on people with lighter skin and women’s chests. After the social network investigated the issue in October, it told The Register it didn’t find any evidence of racial or gender biases, though admitted its engineers probably needed to conduct more tests.
Now, three techies on its Machine Learning Ethics, Transparency, and Accountability team and its Content Understanding Research team have discovered that the saliency algorithm is, in fact, skewed towards White people and women.
We're told the software favored displaying women over men by about eight per cent, and people with lighter skin over those with darker skin by four per cent. When these groups were broken down by gender, there was a seven per cent difference in favor of White women compared to Black women, and two per cent for White men compared to Black men.
The difference seems slight but with millions upon millions of pictures tweeted a day, a good number of people are going to encounter the bias.
The team conducted their tests by comparing the way the software handled images of Black females, Black males, White females, and White males, running the experiment 10,000 times. In the previous study, staff looked at how the code cropped White people versus Black people, White versus Indian, White versus Asian, and male versus female, running the tests just 200 times. The small sample size made it more difficult to detect any significant levels of bias.
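The methodology and sample sizes above matter because a small skew is easy to miss in a noisy estimate. A hypothetical simulation (the function name and parameters are mine, not from the paper) shows how a four-point favoritism gap behaves at 10,000 trials versus 200:

```python
import random

def run_pairwise_trials(p_group_a, n_trials, seed=0):
    """Simulate pairwise crop trials: each trial pairs an image from
    group A with one from group B and records which one a cropper with
    favor-probability p_group_a picks. Returns the observed gap
    (A's win rate minus B's win rate)."""
    rng = random.Random(seed)
    a_wins = sum(rng.random() < p_group_a for _ in range(n_trials))
    a_rate = a_wins / n_trials
    return a_rate - (1 - a_rate)

# A cropper that favors group A 52% of the time has a true gap of 0.04.
print(round(run_pairwise_trials(0.52, 10_000), 3))  # close to 0.04
print(round(run_pairwise_trials(0.52, 200), 3))     # much noisier estimate
```

With 10,000 paired trials the sampling noise on the gap is around a percentage point, so a four per cent skew stands out; at 200 trials the noise can swamp the signal entirely, which is consistent with the earlier study finding nothing conclusive.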
There seemed to be no clear evidence that the algorithm focused on parts of people’s bodies other than their faces, however.
“We found that no more than 3 out of 100 images per gender have the crop not on the head,” they explained in a paper shared on arXiv this week. “The crops not on heads were due to high predicted salient scores on parts of the image such as a number on the jersey of sports players or a badge. These patterns were consistent across genders.”
It’s possible that the problematic crops of women’s chests were caused by logos or images on their clothes, which grabbed the attention of the code.
Since March, Twitter has gradually rolled out an update for smartphone users so that its app doesn’t rely on its saliency algorithm, instead giving people more control over how images are displayed. In short, Twitter now displays "standard aspect ratio photos in full on iOS and Android — meaning without the saliency algorithm crop," and "a true preview of the image in the tweet composer field, so tweet authors know how their tweets will look before they publish."
“We considered the trade-offs between the speed and consistency of automated cropping with the potential risks we saw in this research,” Rumman Chowdhury, director of software engineering at Twitter, said on Wednesday.
“One of our conclusions is that not everything on Twitter is a good candidate for an algorithm, and in this case, how to crop an image is a decision best made by people.”
Twitter said it’s still working to improve its image-cropping techniques for photos displayed on its website and for multiple photos. ®