Researchers defend Facebook emoto-furtling experiment

'All REAL men ignore consent and privacy'

Facebook's “creepy” feed-manipulation experimentation, which has generated an avalanche of outrage among users, isn't without its chums. A growing collection of psychologists and tech pundits is linking arms, standing next to Mark Zuckerberg, and singing “We Shall Overcome” in the direction of mobs carrying metaphorical pitchforks and flaming torches.

Facebook's own defence of its research has been so unconvincing that UK and Irish data watchdogs are now investigating the company.

Others, however, are wading in to defend The Social Network™.

The tl;dr version? You're all wrong, quite possibly ignorant, routinely manipulated, and why should we let ethics get in the way of science? Apparently.

Let's start with this post by Tal Yarkoni, director of the Psychoinformatics Lab at the University of Texas, who is dismissive of “people … very upset at the revelation that Facebook would actively manipulate its users’ news feeds in a way that could potentially influence their emotions.”

His points are:

  1. It's okay, because the effect turned out to be tiny.

“The manipulation had a negligible real-world impact on users’ behaviour”, Yarkoni writes.

That can't have been known to the researchers, nor any hypothetical ethics committee, in advance of the experiment, and therefore is irrelevant to whether or not what Facebook did was ethical.

Moreover, Yarkoni is reporting on the aggregate of all results. Rules surrounding psych tests on humans are there not to protect numbers, but individuals. Demonstrating a small effect across hundreds of thousands of people does not show that none of the individuals were harmed.

  2. It's okay, because the result tells us about what users posted, not how they felt.

“The fact that users in the experimental conditions produced content with very slightly more positive or negative emotional content doesn’t mean that those users actually felt any differently.”

This is a neat sophistry: because we don't know whether users were telling the truth about their feelings in their posts, the argument runs, we can't know that suppressing good or bad news in their feeds actually changed how they felt. But that uncertainty cuts both ways – it equally fails to show that nobody was affected.

  3. It's okay, because all communication on Facebook is manipulated in some way.

“Every single change Facebook makes to the site alters the user experience, since there simply isn’t any experience to be had on Facebook that isn’t entirely constructed by Facebook.”

This is more difficult to answer. It's true, but it's also incomplete.

Facebook's “user engagement” manipulation is designed to make people want to use Facebook more often (and to get them clicking on more advertisements).

There's not much risk, for example, that re-weighting sponsored posts that pop up in a user's feed will make someone with clinical depression feel worse. Does Facebook know that its “emotional contagion” methodology carried no such risk?

  4. Human communication is manipulative

“Everybody you interact with–including every one of your friends, family, and colleagues–is constantly trying to manipulate your behaviour in various ways.”

Are all human interactions deliberately manipulative? Even if the answer is “yes”, interpersonal relations involve visibility, trust, and consent, all of which were absent from the Facebook experiment. There is simply no analogy between how partners in a relationship behave and this experiment.

  5. Ends justify means

“If you were to construct a scale of possible motives for manipulating users’ behaviour – with the global betterment of society at one end, and something really bad at the other end – I submit that conducting basic scientific research would almost certainly be much closer to the former end than would the other standard motives we find on the web – like trying to get people to click on more ads.”

This may be perfectly true, but it's still a distraction from the ethics of the experiment.

Gartner: "Man up"

Yarkoni's analysis was then picked up by a Gartner research director, Martin Kihn.

We could suggest that Kihn's “Man up, people” is probably all you need to know about his response. To say that “Worried about academics following ethical standards of behaviour” equates with “lack of manliness” speaks volumes about tech sector culture, and nothing about research ethics.

“The study itself strikes me as being routine, legal, ethical and unsurprising,” Kihn writes – without supporting his assessment of its ethics whatsoever. Kihn does reiterate two points from Yarkoni, the likely errors in the textual assessments and the smallness of the experiment's impact, before delivering this statement:

“If we start demanding an academic standard of 'informed consent' for routine A/B and multivariate tests run online, we’re skirting the boundaries of absurdity.”

It seems to have escaped Kihn that the research was published in an august academic journal, the Proceedings of the National Academy of Sciences, with named researchers affiliated with universities as well as with Facebook, making the question of informed consent perfectly legitimate.

And El Reg can't help but wonder why informed consent is a concept that requires scare quotes.

Burying ethics in detail

Even a very serious discussion of the issue, one that picks over the relevant laws and regulations, appears to skirt it at the same time. This piece, by Michelle Meyer, lingers over the detail as if it were a fine wine.

Since (as she explains) the Facebook study qualifies as “human subjects research”, Meyer views the question through the prism of legal requirements – was the study subject to the rules governing such research?

Beyond her discussion of methodology and results, Meyer makes the following key points:

  1. The research was “conducted and funded solely by an entity like Facebook”, meaning it “is not subject to the federal regulations”.
  2. The “involvement in the Facebook study by two academics nevertheless probably did not trigger Cornell’s and UCSF’s requirements” (for ethical review).
  3. The study might have passed ethical review had it been submitted.

Meyer also restates the old saw that emotional manipulation is common, citing the advertising industry as an example – and, like Yarkoni, puts forward the apparently disingenuous idea that “everyone does this, so it's ethical”.

Meyer does, however, make the worthwhile point that corporates can do this sort of stuff without the same constraints that apply to academics. She would like to see academic restrictions lifted; others may not agree.

Don't chill the science

Brian Keegan at Northeastern University has been fairly extensively cited for this piece.

Skipping the now-obligatory recap of the research methodology, let's get to what seems to be the meat of Keegan's argument: (a) “every A/B test is a psych experiment”, (b) nobody's discussing what informed consent should look like anyhow, (c) don't chill science: “All this manning of barricades strikes me as a grave over-reaction that could have calamitously chilling effects on several dimensions.”
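Keegan's point (a) is at least easy to make concrete. Below is a hypothetical, minimal Python sketch of the sort of A/B test run constantly across the web – the experiment name, bucket labels, and metric are invented for illustration. Note what the loop does not contain: any step that asks the user anything.

```python
import random

def assign_bucket(user_id: int, experiment: str = "feed_tweak") -> str:
    """Silently and deterministically assign a user to 'control' or 'variant'."""
    random.seed(f"{experiment}:{user_id}")  # stable per-user assignment
    return "variant" if random.random() < 0.5 else "control"

def run_experiment(user_ids, metric):
    """Collect a behavioural metric per bucket and report the mean of each.

    `metric` is any callable mapping a user ID to a number, e.g. clicks,
    session length, or – as in the Facebook study – an emotional-content
    score computed from the user's posts.
    """
    results = {"control": [], "variant": []}
    for uid in user_ids:
        results[assign_bucket(uid)].append(metric(uid))
    return {bucket: sum(vals) / len(vals)
            for bucket, vals in results.items() if vals}
```

As the sketch shows, "every A/B test is a psych experiment" is true in the narrow sense that users are randomised and measured without their knowledge; whether that makes the practice acceptable is precisely the question under dispute.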

A possible conclusion: perhaps Facebook's publication has done the world a favour by lifting the lid on the kinds of behaviours that psych researchers admire and aspire to. ®
