Facebook recently published the results of a one-week study conducted in January 2012, which found that when the number of positive posts in a Facebook user’s News Feed was reduced, the user posted fewer positive posts and more negative posts. The social network cited this finding, among others in the same research, as evidence that social networks can spread “emotional contagion” — that is, the wide-scale transfer of positive or negative emotions between users.
Facebook’s researchers selected 689,000 users for the project and manipulated their feeds to include a higher proportion of positive or negative messages. Using Linguistic Inquiry and Word Count (LIWC) software, Facebook analyzed some 3 million posts from these users anonymously; the posts contained 122 million words, 4 million of which were positive and 1.8 million negative. None of the words were actually seen by researchers, and Facebook justified the experiment by saying that users’ agreement to the social network’s terms and conditions when creating their accounts constituted “informed consent on this research”.
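The kind of analysis described above is, at heart, a dictionary-based word count. A minimal sketch in Python illustrates the idea, assuming small hypothetical word lists — the real LIWC dictionaries are proprietary and far larger, and this is not Facebook’s actual code:

```python
# Toy illustration of LIWC-style sentiment counting.
# The word lists below are hypothetical examples, not the real
# (proprietary) LIWC dictionaries.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "terrible", "worried"}

def classify_post(text: str) -> dict:
    """Count how many dictionary words in a post are positive or negative."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(1 for w in words if w in POSITIVE_WORDS)
    neg = sum(1 for w in words if w in NEGATIVE_WORDS)
    return {"words": len(words), "positive": pos, "negative": neg}

print(classify_post("So happy today, I love this wonderful weather!"))
# {'words': 8, 'positive': 3, 'negative': 0}
```

A post is then labelled positive or negative simply by whether it contains at least one word from the corresponding list — which is how a study can tally millions of positive and negative words without any researcher reading the posts themselves.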
But with this research, Facebook has skated too close to the edge of online users’ willingness to share data. Users accept that any data they share online is likely to be analyzed and studied by the company with which it is shared, typically so that the company can provide a better service or product. What users don’t typically expect is that their “informed consent” extends to participation in what was essentially a sociological study involving the manipulation of their emotional health.
The way in which the study was conducted could have had tragic consequences for those unwitting participants who may already have been suffering poor mental health, but who were randomly selected to receive a higher proportion than usual of negative posts. In the UK and in the US, for example, about one in four people will suffer from a mental illness in any given 12-month period. Given the nature of this particular research, Facebook should have sought explicit consent from participants prior to embarking upon it.
Adam Kramer, one of the Facebook researchers who conducted the study, has at least had the good sense to issue a prompt apology now that the study has been made public, stating that the company has since improved its “internal review practices.” In his own words: “In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
But the irony is that, in trying to address its concern that users would avoid visiting Facebook because of exposure to friends’ negativity, the company has likely ensured that a proportion of its users will now avoid the site out of concern that they will be unknowingly involved in another potentially harmful social experiment. This is an outcome that could have been avoided had Facebook been more transparent at the outset.
Pamela Clark-Dickson is a senior analyst for consumer services at Ovum. For more information, visit www.ovum.com/