
Facebook Study Raises Questions on Ethics

A recent paper published by Facebook has raised issues regarding the social network’s use of unwitting users in its studies. The paper, published in the Proceedings of the National Academy of Sciences, presents evidence that emotional states can be transferred between people through ‘emotional contagion’.

To conduct the study, researchers from Facebook and several American universities modified the algorithm responsible for arranging the news feeds of almost 700 000 users, to test whether changing the number of positive or negative messages people saw made those people less likely to post positive content themselves.

According to the paper, the study, conducted in January 2012, was consistent with Facebook’s Data Use Policy, to which all users agree before creating an account; the authors allege that this agreement constitutes informed consent for the research. That claim raises some ethical issues.

Here is the relevant section of Facebook’s Data Use Policy: “For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research, and service improvement.”

According to a comparison of the 2011 and 2012 versions of Facebook’s Data Use Policy, however, Facebook only added ‘research’ to its user agreement in May 2012, four months after the study.

Four months after the study, in May 2012, Facebook made changes to its Data Use Policy.

“When someone signs up for Facebook, we’ve always asked permission to use their information to provide and enhance the services we offer,” a Facebook spokesperson told Forbes’ Kashmir Hill. “To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word ‘research’ or not.”

According to Professor Mathias Göbel, Chair of the Rhodes University Ethics Committee, Facebook’s Data Use Policy does not specify that users will be involved in active experiments testing their responses to manipulated scenarios. “I use the term manipulated, as users normally expect to be exposed to the same type of ‘reality’ as all other users, but in fact they were not,” said Professor Göbel. Facebook did not merely use user data for observational purposes (compiling statistics and the like), but altered the situation users found themselves in. At the very least, then, users would need to have consented that what Facebook displays to them might be individually manipulated for the purpose of research.

James Grimmelmann, a professor of law at the University of Maryland in the USA, stated: “If you are exposing people to something that causes changes in [their] psychological status, that’s experimentation. This is the kind of thing that would require informed consent.”

Rhodes University law lecturer Ms Helen Kruuse doubts that ticking a box when signing up for Facebook would constitute informed consent under South African law. She further raises the issue of minors potentially having been used in the study, stating, “If the study involved children, then precautions should have been taken.”

Professor Göbel noted that although it is unethical to involve individuals in active research (beyond using available data in aggregated form, for purposes such as statistics) without explicit consent, “one must also realise that some research is only feasible with participants not knowing exactly to what they are exposed, as this would affect their behaviour and thus make the study obsolete.”

According to the paper, the study was approved by Cornell University’s ethics review board, the body that considers all research done by its academics involving human or animal subjects. If the board allowed this research to take place, Ms Kruuse stated, she would “suspect they would need to explain the issues of informed consent, possible participation by minors, ex post facto application (where the research is already completed) and also the importance of using deception as a means of extracting findings from the study (here – deception in the manipulation of messages).”

Study co-author Adam Kramer explained that Facebook was worried people would stop visiting the social network if they saw too many emotional updates: a stream of negative posts could scare some people off, while a surge of positive vibes could leave others feeling left out. That is not what happened, however, and Kramer stresses that the company “never [meant] to upset anyone.”

The PNAS Paper
Facebook’s User Agreement