Facebook Experiments with User Emotions, Inciting Panic


Willingly or not, you may have been a lab rat to Facebook for seven days. Facebook experimented on the accounts of roughly 689,000 users and published the results in a study called “Experimental evidence of massive-scale emotional contagion through social networks”.

Essentially, these users’ News Feed algorithms were changed to highlight different emotional content. Analyzing the resulting posts, the study’s three authors concluded that social networks can play with your mood.

“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” study authors Adam Kramer, Jamie Guillory, and Jeffrey Hancock wrote. “We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.” In plain English, they say that emotions can spread: the more negative posts a user sees in their feed, the more negative their own posts are likely to be, and a similar effect applies to positive posts. Now, if you are shocked to hear that your privacy may have been invaded, relax: Facebook claims that no data was revealed to the researchers and that the entire process was automated.

“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

A certain amount of panic flew around the web, with many writers questioning where research ends and full-on privacy violation begins, and pointing out that, according to Facebook’s Data Use Policy, you agreed to be their guinea pig the day you signed up. Unfortunately, here’s what everyone, from Time Magazine to Forbes, missed: the study was done a long time ago. Co-author Adam Kramer responded thusly:

“OK so. A lot of people have asked me about my and Jamie and Jeff’s recent study published in PNAS, and I wanted to give a brief public explanation. The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.

At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper. Regarding methodology, our research sought to investigate the above claim by very minimally de-prioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody’s posts were ‘hidden,’ they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.

And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it: the result was that people produced an average of one fewer emotional word, per thousand words, over the following week. The goal of all of our research at Facebook is to learn how to provide a better service.

Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.”
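For readers curious what “minimally de-prioritizing” emotional posts might look like mechanically, the sketch below is one plausible reading of Kramer’s description. It is purely illustrative: the cohort-hashing scheme, the drop probability, the word list, and every function name are assumptions, not Facebook’s actual feed code or the paper’s published method.

```python
# Illustrative sketch only; all names and parameters are assumptions,
# not Facebook's implementation.
import hashlib
import random

COHORT_FRACTION = 0.0004                         # "about 0.04% of users, or 1 in 2500"
DROP_PROBABILITY = 0.10                          # assumed chance an emotional post is skipped on one load
NEGATIVE_WORDS = {"sad", "angry", "terrible"}    # placeholder emotional-word list

def in_test_cohort(user_id: str) -> bool:
    """Deterministically assign a tiny fraction of users to the test group."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 10_000
    return bucket < COHORT_FRACTION * 10_000     # 4 buckets out of 10,000

def deprioritize(posts: list[str], user_id: str) -> list[str]:
    """On a single feed load, sometimes skip posts containing an emotional word.

    Skipped posts are not hidden permanently; they remain on friends'
    timelines and may appear on a later feed load.
    """
    if not in_test_cohort(user_id):
        return posts
    kept = []
    for post in posts:
        words = set(post.lower().split())
        if words & NEGATIVE_WORDS and random.random() < DROP_PROBABILITY:
            continue  # de-prioritized on this load only
        kept.append(post)
    return kept

# Example: most users see every post; cohort members occasionally see fewer emotional ones.
feed = ["feeling sad today", "great hike this morning", "dinner with friends"]
print(deprioritize(feed, user_id="user_12345"))
```

Deterministic hashing keeps a user in the same experimental bucket across feed loads, which matches Kramer’s point that a post skipped on one load could still show up on the next.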
