“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.”
This all came to light when Facebook published the results of their manipulations in the current issue of the Proceedings of the National Academy of Sciences (PNAS), in a paper entitled “Experimental evidence of massive-scale emotional contagion through social networks.”
According to Facebook’s lead researcher on the “let’s manipulate our users’ emotions” study (our name for it, not theirs), Adam D. I. Kramer (nickname “Danger Muffin”), “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product.”
Really? Does anybody buy that Facebook deeply cares about their users’ emotions? Remember, Facebook’s users are a commodity to be monetized. One only has to follow the trail of money – and in this case, this sort of research can yield all sorts of information that can be used to boost the revenue generated by a user’s response to a particular sort of post or advertisement.
As to how they did it, Kramer goes on to explain in a note he posted after the storm broke that “our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012).”
In the summary to the published research, Facebook states that “We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”
Opined Christian Sandvig, a Professor of Communication Studies and Information at the University of Michigan, in an interview with the Huffington Post UK, “There is a big difference between our expectations for academic social science and our expectations for Facebook. And that difference is reasonable,” adding that what Facebook did is “not an ethical research design.”
Tech analyst Brian Blau, in an interview with the New York Times, was more blunt. “Facebook didn’t do anything illegal, but they didn’t do right by their customers. Doing psychological testing on people crosses the line.”
Facebook justifies not explicitly obtaining user permission by saying that users agreed to such research when they accepted the Facebook Terms of Service (TOS).
However, there are two other researchers listed on the paper, Jamie Guillory of UC San Francisco and Jeffrey Hancock of Cornell University. Presumably, at least, these two would be required by an institutional review board (IRB) to obtain approval for any experiments conducted on humans.
And even the PNAS editor who edited the Facebook study had qualms. In an interview with The Atlantic, Princeton Professor and PNAS editor Susan Fiske said “I was concerned until I queried the authors and they said their local institutional review board had approved it.”
In the abstract of their paper, Facebook asserts that “Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.”
Facebook has amply proved that emotional states can be transferred; in this case, however, the transfer was a direct result of Facebook’s own actions.