Facebook has hit back at its critics after the social network instructed researchers to meddle with its users' "news feeds" in order to manipulate their emotions.
The free-content ad network sparked anger when it emerged that its data scientist Adam Kramer had given the green light to researchers to filter out positive and negative posts seen by 700,000 people to see how they would react.
A website spokesman, when quizzed by The Register, conveyed the impression that the social network believed deliberately making its users unhappy was worth it, as it made content "more relevant and engaging":
This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible.
A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process.
There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.
Facebook's spokesman then pointed us towards Kramer's "apologetic response" and was keen to note that the study had been "conducted more than 2.5 years ago".
The Mark Zuckerberg-run company appeared to be suggesting that its experiments on Facebook users were somehow less ethically ropey because they took place in the past.
Nonetheless, sorry seems to be the hardest word for an emotionless Facebook. Funny that. ®