Meta fined record-breaking $24.6m for deliberately ignoring political ad law
Pocket change for troubled Facebook giant, plus more US election news
Despite warnings of Chinese and Russian mischief and manipulation ahead of the US midterm elections, it seems American companies and citizens are perfectly capable of denting democracy on their own.
A Washington judge fined Meta $24.6 million this week after ruling that Facebook intentionally broke [PDF] the state's campaign finance transparency laws 822 times. This fine was the maximum amount, we're told, and represents the largest-ever penalty of its kind in the US.
To put the fine in perspective: it's about half a day of Meta's quarterly profits, which in these uncertain economic times dropped to $4.4 billion for Q3 this year.
In addition to paying the pocket change, Meta was ordered [PDF] by the judge to reimburse the Washington state attorney general's legal costs, and the judge ruled those fees should be tripled "as punitive damages for Meta's intentional violations of state law."
While the exact amount hasn't been determined, Attorney General Bob Ferguson said that legal bill totals $10.5 million for Facebook's "arrogance." Again, pocket change.
"It intentionally disregarded Washington's election transparency laws. But that wasn't enough," Ferguson said. "Facebook argued in court that those laws should be declared unconstitutional. That's breathtaking."
The state requires internet outfits like Meta that display political ads on their websites and in their apps to keep records on these campaigns and make these details publicly available. This includes the cost of the advert and who paid for it along with information on which users were targeted and how far the ads reached.
Meta, which at the time was known as Facebook, repeatedly failed to do this, denying netizens details of who was pushing political ads on them. Specifically, the tech giant did not "maintain and make available for public inspection books of account and related materials" regarding the political ads, according to court documents [PDF] filed in 2020.
Washington state first sued Facebook over the missing political advertising records in 2018. That same year, Facebook launched an online searchable library of all the adverts it was running, along with how much they each cost, their reach, and other data, in an attempt to be more transparent about how its advertising system worked.
However, this library did not satisfy Washington's ad disclosure requirements, so Facebook banned all political ads in the state in hopes of sidestepping the rules. Ferguson's office took Meta to court again in 2020 after it emerged the biz was still accepting political ads in Washington.
Problems with political ads, however, extend beyond Washington state, according to a report published this week.
So-called "pink-slime newsrooms" — hyper-partisan publications dressed up as independent regional media — are spending millions of dollars on Facebook and Instagram ad campaigns in battleground states in the lead-up to America's November midterm elections, a NewsGuard Misinformation Monitor found. These ads either push netizens to obviously left- or right-leaning articles, or are snippets of articles contained within the ad.
Four of these outlets, some backed by Republican and others by Democratic donors, have collectively spent $3.94 million on ad campaigns running simultaneously on Meta's platforms so far in 2022, according to an investigation by the media trust org. The ad content, and the articles they link to, are at best highly partisan, and at worst play fast and loose with the truth to push a point. The goal, it seems, is to get people fired up enough to vote for one particular side, while appearing to be published by a normal media operation rather than a political campaign.
"Facebook and Instagram have been pivotal in these groups' strategies," according to NewsGuard, which says pink-slime newsrooms use Meta's "low costs, hypertargeting tools, and porous policies related to political ad spending to target voters in battleground states while underplaying or entirely hiding their partisan-driven agendas and financing."
Their strategy seems to work, too. One of the publishers, Courier Newsroom, in an August 2022 case study, touted spending $49,000 on Facebook ads targeting 12 Iowa counties ahead of the state's June 2022 primary election. The political spending resulted in 3,300 more votes, which NewsGuard suggested likely went to Democrats.
Misinformation turns violent
While lies and made-up stuff on social media platforms can polarize people, upset discourse, and cast doubt on voting infrastructure and democracy as a whole, there's also the problem of threats and physical violence. No one can forget the January 6, 2021 storming of the Capitol building by raging Trump supporters and other hard-right insurrectionists, bent on overturning a fair presidential election.
A study published this week, based on more than 100,000 Twitter posts from the 2020 Congressional elections, found that candidates who are women of color were twice as likely as other candidates to be the subject of mis- and disinformation. They were also more than four times as likely as white candidates, and twice as likely as men of color, to be targeted with violent abuse, according to the Center for Democracy and Technology's research [PDF].
"Identity-based online [gender-based violence] targeted at women of color candidates focused on the transgressiveness of running for office (i.e. a woman seeking power, as someone presumed unworthy or unsuited for power or authority)," the CDT authors wrote.
It's worth noting that just 10 percent of candidates running for Congressional offices in 2020 were women of color.
Based on its findings, the CDT recommends social media platforms do a better job of publicizing their policies prohibiting online abuse, invest more resources in enforcing those policies, and publish transparency reports on misinformation and abuse, especially in the run-up to elections.
Tech companies should also "scrutinize the role of political advertising in spreading mis- and disinformation and abuse on their services," the authors wrote. ®