Facebook Experiment Secretly Manipulated Emotions
Cecily Kellogg
Do you ever have those days scrolling through your Facebook feed when you think, “Goodness, EVERYONE is so upset today!” Turns out? A Facebook experiment might have been playing you — your emotions, that is.
In January of 2012, Facebook teamed up with researchers from the University of California and Cornell to see if, by manipulating the feeds of over 700,000 people, they could influence what those same people posted. Some folks were shown only negative or sad updates, while others saw only happier news. And, shockingly, people responded like, well, people, and shared updates similar in tone to the feeds they read.
Or, as the study puts it:
“We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”
Was what Facebook did illegal? Probably not, although Slate feels Facebook is playing a little fast and loose with its terms of service. It is, however, considered highly unethical to perform experiments on individuals who haven’t given informed consent, and in fact, the journal the study was published in follows the Helsinki protocol, which this study violates (also according to Slate).
While most of us have become accustomed to the inherent lack of privacy that comes with using Facebook, and have grudgingly accepted Facebook’s right to modify what appears in our news feeds, it’s another thing entirely to discover that Facebook is deliberately experimenting on us. And it feels downright creepy.
According to this article in Forbes, Facebook claims the research was done for internal use only – which is plainly untrue, considering it was an academic study that may have had federal funding. And this assessment from the Forbes article is particularly chilling:
“One usable takeaway in the study was that taking all emotional content out of a person’s feed caused a ‘withdrawal effect.’ Thus Facebook now knows it should subject you to emotional steroids to keep you coming back. It makes me wonder what other kind of psychological manipulation users are subjected to that they never learn about because it isn’t published in an academic journal.”