Facebook exec extracts foot from mouth: We didn't really mean growth matters more than human life

Damage control needed for damage control

Facebook held a press conference on Thursday to provide details about its efforts to prevent electoral manipulation, only to have its damage control eclipsed by the publication of an executive's internal memo from 2016 suggesting growth mattered more than human life.

Acknowledging that Facebook had been used "to divide Americans, and to spread fear, uncertainty and doubt," Guy Rosen, VP of product management, insisted that Facebook takes electoral interference seriously.

"Now, none of us can turn back the clock, but we are all responsible for making sure the same kind of attack our democracy does not happen again," he said.

Rosen's reassurance coincided with BuzzFeed's publication of a leaked memo composed by Facebook VP Andrew Bosworth in 2016 that attempted to justify a policy of growth – connecting people – at any cost, even at the cost of human life.

"Maybe [connecting people] costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools," the memo says. "...The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good."

Bosworth and CEO Mark Zuckerberg, still scrambling to put out the Cambridge Analytica fire, both repudiated the memo.

Bosworth said he was just trying to be provocative. "I don't agree with the post today and I didn't agree with it even when I wrote it," he said, a disavowal that invites doubts about the reliability of Facebook's current statements.

Expect the memo to come up if Zuckerberg, as rumored, ends up testifying before Congress in a few weeks. But back to Facebook's election defense plans.

Listening to Stamos too late?

Rosen deferred to outgoing Facebook security chief Alex Stamos to outline Facebook's strategy for fighting disinformation.

Stamos outlined the four major focuses of Facebook's efforts: fake identities, fake audiences, fake facts, and fake narratives.

"At the end of the day, we’re trying to develop a systematic and comprehensive approach to tackle these challenges, and then to map that approach to the needs of each country or election," said Stamos.

Facebook product manager Samidh Chakrabarti said the company blocks millions of fake account creation attempts every day and has begun proactively hunting for fake accounts rather than just waiting for reports from Facebook users.

The social media behemoth is also using a tool developed last year for the Alabama special Senate election to spot attempted election interference elsewhere in the world.

The message from Tessa Lyons, product manager on News Feed, was similar. She described how Facebook is partnering with fact-checking organizations and has begun fact checking photos and videos (in addition to links), initially in France with AFP and later elsewhere.

"We’re seeing progress in our ability to limit the spread of articles rated false by fact-checkers, and we’re scaling our efforts," said Lyons.

Rob Leathern, product management director for Facebook's ad team, described a new ad review process for US advertisers buying political ads. Facebook Page admins will have to submit government IDs, addresses will be confirmed by postal mail, and advertisers will have to disclose the person or organization they represent. Such ads will also be labelled more clearly in Facebook and Instagram feeds.

Over the summer, Facebook intends to debut a public archive of all ads with a political label that will include metrics like cost and reach, said Leathern.

There. All fixed. ®
