Facebook Alters Almost 700K User Feeds In Mind-Warping ‘Emotional Contagion’ Experiment
Here’s another detail, even more interesting and far more disconcerting: researchers figured that out by running experiments. On Facebook. Without your knowledge or consent.
Here’s a snippet from the “Significance” section of the paper, which was published in the Proceedings of the National Academy of Sciences of the United States of America (PNAS):
We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.
That’s almost 700,000 people that Facebook experimented on. “In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed,” reads the article’s abstract. “When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred.”
The research itself is significant because, according to its authors, “emotional contagion” can occur not just in face-to-face interactions but also through social media. A social network, then, could be a vehicle for massive emotional contagion.
The results are fascinating, to be sure, but what of the research process by which Facebook (and it is Facebook here; the lead author is Facebook data scientist Adam D.I. Kramer) gathered this data? The social network purposely manipulated the News Feeds of hundreds of thousands of people. Shouldn’t Facebook have had to notify those users that it was doing... something?
Research on human subjects is typically governed by an ethics board. University-based research goes through an Institutional Review Board (IRB), which must approve a research team’s methods and procedures before a study can proceed. The point is to protect research subjects from abuse. (If you haven’t read about the Stanford Prison Experiment or the Milgram Experiment, do so now.)
Facebook presumably had to adhere to some set of research ethics to conduct this study, but it’s not clear which. Because Facebook is not affiliated with a university, it did not need IRB approval; it’s possible that all Facebook had to do was tell PNAS that it had followed its own internal research ethics. Thus, whoever approves internal research at Facebook apparently decided that the social network’s broad data use policy, which all users agree to, permitted it to conduct this experiment without asking users’ permission.
Of course, there’s always risk when researchers run tests. Ostensibly, research ethics boards exist to ensure that test subjects are protected throughout. Did that happen in this case?