
Facebook: Ha! Like we'd stop researchers messing with your mind

CTO: Backlash, new guidelines, blah blah. Oh look, a squirrel

Facebook has attempted to reassure users upset by the fact that it allows organisations to experiment on them by promising stricter guidelines for researchers.

The social network will still let organisations experiment on its users – it’s just going to make them go through an “enhanced review process” to do so.

Facebook CTO Mike Schroepfer said in a blog post that the social media firm was shocked to discover that people didn’t like it when they found out Facebook had preyed on their emotions for research purposes. He said that the company had had all the right reasons for wanting to do the study.

“In 2011, there were studies suggesting that when people saw positive posts from friends on Facebook, it made them feel bad. We thought it was important to look into this, to see if this assertion was valid and to see if there was anything we should change about Facebook,” he said.

“Earlier this year, our own research was published, indicating that people respond positively to positive posts from their friends.

“Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism. It is clear now that there are things we should have done differently.”

To stop people feeling violated again, Facebook says it has given researchers clearer guidelines and will review their proposals more closely if they’re aimed at particular groups or concerned with deeply personal things – like their emotions.

The panel making the call will include senior researchers and people who work for Facebook in engineering, research, legal, privacy and policy. The firm has also added some educational bits on research practices into its six-week training “bootcamp” for engineers, so they know what they’re reviewing.

“It’s important to engage with the academic community and publish in peer-reviewed journals, to share technology inventions and because online services such as Facebook can help us understand more about how the world works,” Schroepfer reckons.

“We want to do this research in a way that honours the trust you put in us by using Facebook every day. We will continue to learn and improve as we work toward this goal.” ®

