Meta disbands Responsible Innovation team, spreads it out over Facebook and co
Still unclear: Were members just screaming into a void for the past few years?
Facebook parent Meta has disbanded its Responsible Innovation Team (RIT) that it claimed last year was a central part of efforts to "proactively surface and address potential harms to society in all that we build."
A Meta spokesperson confirmed the change to the WSJ yesterday. The RIT previously included two dozen engineers, ethicists and other Meta employees who were responsible for identifying and addressing concerns with products and updates to Facebook and Instagram.
Meta spokesman Eric Porterfield told The Register that, rather than ending the efforts of the RIT, the disbanding will see "the vast majority" of the 20-person team moved into other areas at Meta "to help us scale our efforts by deploying dedicated experts directly into product areas, rather than as a standalone team."
Per Porterfield, Meta's official position is that the work done by the RIT is now more of a priority, not less, despite what disbanding the team might suggest.
How effective was the RIT?
The company’s Margaret Gould Stewart, who was most recently at the reins of the Responsible Innovation Team, wrote in a blog post last year that the RIT was akin to a doctor serving as a first-line generalist, referring issues to specialists as needed.
"In the product design context, this means thinking not just short- to mid-term, but investing time to forecast what longer term impacts might be," Stewart said.
However, news coverage of Meta, Facebook, and Instagram over the past few years has largely centered on controversy over the company's choices, suggesting the RIT had little influence on final business decisions at Meta.
In the past few years, the Zuckerberg-helmed company has been involved in plenty of questionable behavior, including the recently settled Cambridge Analytica scandal, revelations that the company has repeatedly violated Washington state political ad laws, a $400 million penalty in Ireland for violating the privacy of children, code injection to circumvent Apple's App Tracking Transparency, massive losses trying to prop up Zuckerberg's vision of the metaverse and more.
Those scandals raise an obvious question: Was the Responsible Innovation Team able to accomplish anything?
Per the WSJ, previous RIT leader Zvika Krieger said the team was involved in Facebook's decision to exclude a race filter in dating profiles, a feature he said was later picked up by other dating apps. Stewart said that during her tenure, the RIT was involved in planning Meta's COVID-19 related products.
"We wanted teams to consider things like combating misinformation about the virus, whether a tool could be exploited by profiteers, or whether a feature could be unintentionally offensive or insensitive," Stewart said.
Stewart's aforementioned blog post, published in June 2021 and including the COVID-19 statement, came just months before the company owned up to another snafu.
In August of 2021, Facebook released a report on the most viewed content of the past quarter (Q2 2021). The New York Times called the company out, saying it had seen a report from the first quarter of the year, which Facebook quickly published.
The report revealed that Facebook's most-viewed link was a news story that, while factually accurate, implied a Florida doctor's death two weeks after receiving the COVID-19 vaccine was caused by the injection, despite no established link between the two.
Facebook's secret Q1 report stated the story had been shared over 53 million times.
At the time, Facebook Policy Communications Director Andy Stone gave a standard Meta response: We're sorry and we'll do better in the future.
If only Facebook had a team in place last year that could have nipped that bad decision in the bud. ®