
Meta agrees to tweak ad system after US govt brands it discriminatory

And pay the tiniest of fines, too

Facebook parent Meta has settled a complaint brought by the US government, which alleged the internet giant's machine-learning algorithms broke the law by blocking certain users from seeing online real-estate adverts based on their nationality, race, religion, sex, and marital status.

Specifically, Meta violated America's Fair Housing Act, which protects people looking to buy or rent property from discrimination, it was claimed. Under that law, it is illegal to refuse to sell or rent a home to someone, to advertise housing only to certain demographics, or to evict tenants, on the basis of protected characteristics.

This week, prosecutors sued Meta in New York City, alleging the mega-corp's algorithms discriminated against users on Facebook by unfairly targeting people with housing ads based on their "race, color, religion, sex, disability, familial status, and national origin."

Meta agreed to settle the case, and promised to pay a $115,054 fine to end the matter. Crucially, it also agreed to tweak its ad-targeting system. The US government couldn't issue a heftier sanction: that figure is the maximum civil penalty for violating the FHA, and we suspect the primary aim was to force a change in the software anyway.

"When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the FHA, just as when companies engage in discriminatory advertising using more traditional advertising methods," Damian Williams, US Attorney from the Southern District Court of New York, said in a statement.

"As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner," the Department of Justice's Assistant Attorney General Kristen Clarke from the Civil Rights Division added. "This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit."

Meta allowed advertisers to target specific users through its "Lookalike Audience" and "Special Ad Audience" tools, it was alleged. The software uses machine learning to group users from similar backgrounds so advertisers can deliver adverts to people of a particular race, religion, or sex, and block those who aren't in those groups from seeing them. Which would be a no-no.
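Meta hasn't published the internals, but for a sense of the general mechanism, here is a minimal sketch of how a lookalike-style system could expand a seed audience by feature similarity. Everything in it (the feature vectors, the cosine-similarity measure, the 0.9 threshold) is our assumption for illustration, not Meta's actual code.

```python
# Hypothetical sketch of lookalike-audience expansion. This is NOT
# Meta's implementation: the features, similarity measure, and
# threshold are all assumptions made for illustration.
import numpy as np

def build_lookalike_audience(seed_vectors, candidate_vectors, threshold=0.9):
    """Return indices of candidates whose cosine similarity to the
    average seed-user profile meets or exceeds the threshold."""
    centroid = seed_vectors.mean(axis=0)
    centroid /= np.linalg.norm(centroid)
    sims = (candidate_vectors @ centroid) / np.linalg.norm(candidate_vectors, axis=1)
    return np.where(sims >= threshold)[0]

# The complaint's point: if the feature vectors encode, or proxy for,
# protected traits such as race or sex, the expanded audience inherits
# that bias, and everyone outside it never sees the housing ad.
```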

The Special Ad Audience system was implemented as an alternative to the Lookalike Audience tool. Meta pledged to discontinue its Special Ad Audience system for housing-related adverts as well as other riskier categories, such as employment and credit ads, as part of the deal with Uncle Sam.

Advertisers will have to comply with the company's non-discrimination policies, and the ability to target specific users on social media will be restricted. For example, Meta has banned gender and age targeting for housing, employment, and credit-related adverts, and location targeting must cover at least a 15-mile radius, so advertisers can't single out or exclude small neighborhoods.
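To show what such a restriction might look like in practice, here's a hypothetical sketch of a minimum-radius check for location targeting. The function names, the category list, and the widen-rather-than-reject behavior are our assumptions, not Meta's actual API.

```python
# Hypothetical policy check, not Meta's code: restricted ad categories
# get a minimum targeting radius, and a user's coordinates are then
# tested against the (possibly widened) circle.
import math

MIN_RADIUS_MILES = 15.0
RESTRICTED_CATEGORIES = {"housing", "employment", "credit"}
EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

def in_target_area(user_latlon, center_latlon, radius_miles, category):
    """Widen the radius to the minimum for restricted categories,
    then test whether the user falls inside the targeting circle."""
    if category in RESTRICTED_CATEGORIES:
        radius_miles = max(radius_miles, MIN_RADIUS_MILES)
    return haversine_miles(*user_latlon, *center_latlon) <= radius_miles
```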

"We're making this change in part to address feedback we've heard from civil rights groups, policymakers and regulators about how our ad system delivers certain categories of personalized ads, especially when it comes to fairness…Discrimination in housing, employment and credit is a deep-rooted problem with a long history in the US, and we are committed to broadening opportunities for marginalized communities in these spaces and others," Meta said in a statement. ® 
