Facebook Research Dances Around Informed Consent
The title of the research paper is certainly scholarly. In "Experimental evidence of massive-scale emotional contagion through social networks," published in the Proceedings of the National Academy of Sciences, researchers reported the results of a "massive experiment on Facebook." By exposing some users to more positive posts than usual in their News Feeds, and others to more negative ones, they showed that moods can spread across the network like a disease.
"For people who had positive content reduced in their News Feed, a larger percentage of words in people's status updates were negative and a smaller percentage were positive," the paper notes. "When negativity was reduced, the opposite pattern occurred."
The mainstream media had fun with the story. "Facebook emotions are contagious!"
But as the story spread online, and notably after this report in The A.V. Club, actual researchers took notice. And many are upset.
The problem is "informed consent," a fundamental principle of research involving human subjects. While it can get complicated, it basically means researchers must meet three requirements:
Disclosing to potential research subjects information needed to make an informed decision;
Facilitating the understanding of what has been disclosed; and
Promoting the voluntariness of the decision about whether or not to participate in the research.
The paper summarily dismisses this critical issue, asserting that all Facebook users agree to be studied simply by using Facebook. It addresses the matter in a single sentence fragment: "...It was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research."
And indeed, Facebook's Data Use Policy does mention research: "...In addition to helping people see and find things that you do and share, we may use the information we receive about you for internal operations, including troubleshooting, data analysis, testing, research and service improvement."
But many say that's not enough.
"This study is in violation of laws regarding Human Subject protocols in research," writes Gwynne Ash in a comment on The A.V. Club story. Ash, a professor at Texas State University, goes on to say:
"In this study there was no disclosure to participants that they were members of a research study, even though the purpose of the study was to produce negative emotional states, such as depression, through specific manipulation of data provided to participants (i.e., this was not a naturalistic study). The blanket research permission that is part of the Facebook TOS in no way approaches ethical appropriateness for human subject research of this type. There was also no debriefing of study participants. The publication of this study breaches all accepted protocols for the protection of human subjects in experimental research..."
Aimee Giese, someone I've followed online since 2009, put it much more succinctly: "there is NO WAY Facebook did not violate human subjects rules."
Apparently, informed consent rules don't officially apply to private companies conducting their own research. But while the lead researcher was a Facebook employee, the co-authors were affiliated with institutions of higher education (the University of California, San Francisco, and Cornell University) that most certainly do adhere to the requirement.
At the University of Hawaii, the Human Computer Interaction Lab conducts a lot of research into social computing. Lab director Scott Robertson, who is also an Associate Professor in the Department of Information and Computer Sciences, shared his initial thoughts with me.
"My opinion is that what Facebook did here is unethical, but it is a fuzzy boundary," he said.
"For example, Facebook (and others) conduct so-called A/B studies all the time where they present different interfaces, or different ads, or use different algorithms to different customers and measure things like time spent on the page, click rate, buying, et cetera," Robertson explained. "If you think about it, they are purposely manipulating the experience and emotions of users in these situations as well, but somehow this seems OK to me."
"This is a bit of a new frontier, and we will see a lot of this type of thing in the future," he added.
I don't know if I was one of the hundreds of thousands of Facebook users included in this study, but I definitely feel manipulated and grumpy. So either way, their test worked.