Facebook ad platform discriminates all on its own, say boffins
Bigots, racists and haters rejoice! You are going to get bias out of the box
Facebook has been taking a lot of stick over discrimination on its platform, but a new paper suggests the problems go deeper than the choices advertisers make.
After three years of criticism that its ad system allows advertisers to unlawfully discriminate, Facebook last month announced changes to its ad platform intended to prevent advertisers from deploying unfair credit, employment and housing ads. A week later, the US Department of Housing and Urban Development sued the social ad biz for violating the Fair Housing Act.
However, research just published through the pre-print server arXiv suggests preventing advertisers from distributing discriminatory ads is only part of the challenge for those favoring equity; Facebook also needs to examine the bias baked into its ad-slinging infrastructure.
According to boffins from Northeastern University, the University of Southern California, and tech accountability non-profit Upturn, Facebook's ad delivery system itself can steer ads intended to be inclusive toward discrimination without explicit intent.
In a paper titled "Discrimination through optimization: How Facebook’s ad delivery can lead to skewed outcomes," co-authors Muhammad Ali, Piotr Sapiezynski, Miranda Bogen, Aleksandra Korolova, Alan Mislove, and Aaron Rieke find that advertiser budgets and ad content affect ad delivery, skewing it along gender and racial lines even when neutral ad targeting settings are used.
The researchers found that Facebook ads tend to be shown to men because women click on ads more often, making them more expensive to reach through Facebook's system. A cost-conscious delivery system fills a limited budget with the cheaper male audience first, so the divide becomes apparent when ad budgets are compared. As the paper explains, "the higher the daily budget, the smaller the fraction of men in the audience."
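To see how a cost-driven delivery engine produces that skew without anyone asking for it, consider the toy model below. It is not Facebook's actual auction – the prices, the click-rate gap and the spend-the-cheapest-impressions-first logic are all assumptions for illustration – but it reproduces the budget effect the paper describes: the male share falls as the budget rises.

import random

random.seed(0)

def make_pool(n=10_000):
    # Hypothetical per-impression prices: women cost more to reach because
    # they click more often; the price gap and the noise are assumptions.
    men = [("m", 0.50 * random.uniform(0.5, 1.5)) for _ in range(n // 2)]
    women = [("f", 0.90 * random.uniform(0.5, 1.5)) for _ in range(n // 2)]
    return sorted(men + women, key=lambda imp: imp[1])  # cheapest first

def male_share(daily_budget, pool):
    # A cost-minimising optimiser buys the cheapest impressions first.
    spent, men, total = 0.0, 0, 0
    for gender, price in pool:
        if spent + price > daily_budget:
            break
        spent += price
        total += 1
        men += gender == "m"
    return men / total if total else 0.0

pool = make_pool()
for budget in (50, 500, 2500, 5000):
    print(f"${budget:>4}/day -> {male_share(budget, pool):.0%} male")

Small budgets are exhausted almost entirely on the cheap (male) end of the pool; only a bigger spend forces the optimiser into the pricier female inventory.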
Such segregation may be appropriate and desirable for certain types of marketing pitches, but when applied to credit, employment and housing ads, the consequences can be problematic.
The power of images
Ad content – text and images – also has a strong effect on whether ads get shown to men or women, even when the bidding strategy is the same and gender-agnostic targeting is used.
In particular, the researchers found images had a surprisingly large effect on ad delivery. The destination URL alone has some effect: with nothing but a link, an ad pointing to a bodybuilding site and an ad pointing to a cosmetics site had baseline delivery distributions of 48 per cent men and 40 per cent men respectively. Adding a title and headline didn't change that much.
But once the researchers added an image to the ad, the distribution pattern changed, with the bodybuilding site ad reaching an audience that was 75 per cent male and the cosmetics ad reaching an audience that was 90 per cent female.
According to the researchers, their tests suggest "Facebook has an automated image classification mechanism in place that is used to steer different ads towards different subsets of the user population."
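Facebook's classifier and relevance models aren't observable from the outside, so the sketch below is purely illustrative – the topic labels, function names and click-through numbers are all assumptions. It shows only how a delivery-time relevance score keyed to image content would skew an audience even when the advertiser's targeting is gender-neutral.

def classify_image(image_path):
    # Stand-in for Facebook's (unobserved) automated image classifier.
    return "cosmetics" if "lipstick" in image_path else "bodybuilding"

# Hypothetical learned priors: predicted click-through rate by topic and gender.
PRIOR_CTR = {
    ("bodybuilding", "male"): 0.031, ("bodybuilding", "female"): 0.009,
    ("cosmetics", "male"): 0.004,    ("cosmetics", "female"): 0.038,
}

def delivery_score(image_path, user_gender):
    # The score deciding who sees the ad is computed at delivery time,
    # after targeting, so the skew appears even with neutral targeting.
    return PRIOR_CTR[(classify_image(image_path), user_gender)]

print(delivery_score("ads/lipstick.jpg", "female"))  # high: shown mostly to women
print(delivery_score("ads/lipstick.jpg", "male"))    # low: rarely shown to men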
(Facebook's marketing API uses "male," "female," or "unknown" as valid gender values; the researchers chose to ignore the "unknown" category for this study.)
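(For the technically curious: delivery breakdowns of that sort can be read out of the Marketing API's insights endpoint, roughly along the lines of the sketch below. The API version, ad ID and token are placeholders, and field names may vary between API versions.)

import requests

ACCESS_TOKEN = "..."   # a real Marketing API token goes here
AD_ID = "123456789"    # hypothetical ad ID

resp = requests.get(
    f"https://graph.facebook.com/v3.2/{AD_ID}/insights",
    params={
        "access_token": ACCESS_TOKEN,
        "fields": "impressions,reach",
        "breakdowns": "gender",  # rows come back as "male", "female", "unknown"
    },
)
for row in resp.json().get("data", []):
    print(row["gender"], row["impressions"])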
In terms of credit, employment and housing ads, the problem with this system is that it discriminates where it shouldn't: Five ads for lumber industry jobs were delivered to an audience that was more than 90 per cent men and more than 70 per cent white; five ads for janitorial work were delivered to an audience that was more than 65 per cent women and 75 per cent black. Housing ads also showed a racial skew.
As the researchers point out, these ads used the same bidding strategy and neutral audience targeting parameters (set by the advertiser), and they all ran at the same time. The only difference was the destination link and ad creative.
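In code, that experimental design amounts to something like the following – loosely modelled on Marketing API ad-set parameters rather than an exact API call, with field names and values illustrative. Everything the advertiser controls is pinned; only the link and image vary.

# Held constant across every ad in a run: bid, budget, schedule, targeting.
COMMON = {
    "bid_strategy": "LOWEST_COST_WITHOUT_CAP",  # identical bidding for all ads
    "daily_budget": 2000,                       # identical budget (in cents)
    "targeting": {                              # neutral: no gender, age or
        "geo_locations": {"countries": ["US"]}, # interest splits
    },
    "start_time": "2019-01-15T00:00:00-0500",   # all ads run simultaneously
}

variants = [
    {"link": "https://example-lumber-jobs.test",     "image": "lumber.jpg"},
    {"link": "https://example-janitorial-jobs.test", "image": "janitor.jpg"},
]

ads = [{**COMMON, **v} for v in variants]
# With advertiser-side inputs identical, any difference in who the ads reach
# is attributable to the platform's own delivery optimisation.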
READ MORE"Our findings underscore the need for policymakers and platforms to carefully consider the role of the optimizations run by the platforms themselves – and not just the targeting choices of advertisers – in seeking to prevent discrimination in digital advertising," the researchers conclude.
Commenting on the findings via Twitter, Alex Stamos, former chief security officer at Facebook and current Stanford University lecturer, said, "This is almost the perfect case study for [machine learning] picking up on existing biases and amplifying them. You can blind the algorithm to sensitive aspects of users but it will find proxies. I'm not sure this can be solved except by having no algorithmic optimization of certain ad classes."
Facebook insists it is looking at ways to improve its ad system. "We stand against discrimination in any form," said Joe Osborne, a Facebook spokesperson, in an email to The Register.
"We’ve announced important changes to our ad targeting tools and know that this is only a first step. We’ve been looking at our ad delivery system and have engaged industry leaders, academics, and civil rights experts on this very topic – and we're exploring more changes." ®