Whistleblower Frances Haugen today urged Congress to regulate Facebook and its algorithms that she said put immense profit before safety and society.
“I am here today because I believe that Facebook’s products harm children, stoke division, weaken our democracy, and much more,” Haugen, a former program manager at Facebook, said [PDF] at a Senate commerce and science subcommittee hearing.
“The company’s leadership knows ways to make Facebook and Instagram safer and won’t make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They cannot solve this crisis without your help.”
Yeah, we know: corporation puts profit first, who'd have thought? Bear in mind, though, Facebook is used by 1.9 billion people every day, and made a $29.1bn profit in 2020, up 58 per cent on the year before. That's some responsibility on its shoulders.
A string of internal documents, leaked to the Wall Street Journal and dubbed The Facebook Files, revealed how exactly the social media giant prioritizes growth and engagement over its users' well-being. Haugen outed herself on Sunday as the source of those documents, saying she had "tens of thousands of pages of Facebook internal research" to share, and added: "I have to get out enough that no one can question that this is real."
In addition to her testimony to US lawmakers this week, Haugen, via Whistleblower Aid, has also filed eight complaints with America's financial watchdog. These included claims Facebook misled people about how its platform was used in the January 6 insurrection attempt [PDF], how it provided a moderation-free service exclusively to high-paying or celebrity users [PDF], and how old its users are [PDF]. Its audience of US teens and young people is said to be shrinking.
"The company has also hidden the extent to which content production per user has been in long-term decline," states one of her complaints.
Haugen, who previously worked for Google, Pinterest, and Yelp in product and software roles, joined Facebook in 2019 and was on its then Civic Integrity team, filtering out political misinformation, and later worked in counter-espionage for the biz. She said that immediately after the 2020 US presidential election the unit was dispersed to work on other areas and attention on the issue of political misinformation dropped off. She left the tech mega-corp in May this year.
"The documents I have provided," Haugen told the hearing, "prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages. I came forward because I believe that every human being deserves the dignity of the truth."
Facebook is not your friend
Teenagers, particularly girls, are bombarded on Facebook's empire with images of airbrushed models and celebrities. The pressure to look picture-perfect on Instagram caused some to develop body dysmorphia, it's reported, driving them to eating disorders and anorexia.
Data collected from focus groups and online surveys between 2019 and 2021 by Facebook indicated to executives that Instagram simply made some teen girls feel worse about themselves, it is claimed. Meanwhile, the Silicon Valley titan continued to deploy algorithms to keep those users addicted to scrolling, and continued to figure out ways to attract ever-younger, impressionable kids to its social media platforms, knowing full well the harm it was potentially doing to some.
Instagram doesn’t allow people under the age of 13 to sign up for an account. Senator Marsha Blackburn (R-TN) said Facebook’s global head of safety Antigone Davis, who testified at a Senate subcommittee hearing last Thursday, estimated about 600,000 Instagram accounts made by users under 13 were removed last year.
Haugen said Facebook saw this as an opportunity to expand its reach: it aimed to launch new services, such as Instagram Kids, for preteens denied normal accounts. Bosses said they were "pausing" plans for Instagram Kids last month, which may be related to 44 attorneys general – Republican and Democrat – expressing [PDF] serious concerns about the service.
When Haugen was asked point-blank whether Facebook ought to be regulated, she replied: “Until incentives change at Facebook, we should not expect Facebook to change. We need change from Congress.”
Amazingly, Facebook disagrees
In response to the hearing, Facebook said it totally wanted to be governed by internet laws passed by Congress, the very body it so enjoys lobbying behind the scenes to get its own way. The biz also attempted to downplay Haugen's knowledge of the social network's operations, and said she had been employed at the company for not that long – and who could blame her? – and that she had mischaracterized its internal data.
"Today, a Senate commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives — and testified more than six times to not working on the subject matter in question," it said.
"We don’t agree with her characterization of the many issues she testified about. Despite all this, we agree on one thing: it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”
Facebook has also come under fire for failing to eradicate COVID-19 vaccine misinformation from its realm of apps and websites. Anti-vax content, at least, was rampant on Facebook and promoted by its newsfeed algorithms.
“Since the beginning of the pandemic across our entire platform, we have removed over 3,000 accounts, Pages and groups for repeatedly violating our rules against spreading COVID-19 and vaccine misinformation and removed more than 20 million pieces of content for breaking these rules,” Facebook said in an earlier blog post.
To rein in the harmful effects of Facebook’s content algorithms, Senator John Thune (R-SD) suggested legislation to reform Section 230 of the Communications Decency Act. That section states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider," meaning Facebook cannot typically be held liable for toxic or harmful content posted on its platforms.
“We need to hold Big Tech accountable by reforming Section 230," Senator Thune said. "One of the best opportunities to do that in a bipartisan way is the Platform Accountability Consumer Transparency Act. That, in addition to stripping Section 230 protections for content that is illegal, would also increase transparency and due process for users.” ®
An NPR article – full disclosure: it was written by a journalist married to a Facebook staffer – has argued the leaked Facebook research on the harm done by Instagram to its teen users is not as conclusive as it may seem. The main point is that the sample size of the surveys may be too small to be useful. Rather than letting Instagram off the hook, this only means more substantial studies are needed.