Facebook is playing with your emotions, and the social media company has come under fire for conducting a controversial experiment that may have affected people's moods. There's no way to tell whether you were one of the users manipulated, but you should still know what happened and why.
What Happened in the Experiment?
Facebook altered the number of positive and negative posts people saw in their News Feeds to see how the statuses and posts in users' feeds would affect their moods. The researchers expected that when people saw a lot of happy posts, they'd feel bad or left out, and that when they saw too many negative posts, they'd avoid visiting Facebook. (See the full research paper here.)
What Were the Results?
Researchers found that people's emotions were reinforced by what they saw, an effect they call "emotional contagion" via social networks. When users saw more positive posts, they shared more positive and fewer negative posts of their own; when they saw more negative posts, they shared more negative and fewer positive ones.
When Did This Happen?
Over the course of one week in 2012.
How Many People Were Affected?
Facebook altered about 700,000 people's News Feeds — roughly 0.04 percent of its users.
Who Conducted It?
A data scientist at Facebook and researchers from UC San Francisco and Cornell University.
Why Is It So Controversial?
First of all, Facebook conducted the experiment without informing users. Many people say it's unethical that Facebook played with people's emotions without their consent. The study authors, however, said that Facebook's data use policy and terms of service, which all users agree to when they sign up, make such experiments OK.
In the wake of both the Snowden stuff and the Cuba twitter stuff, the Facebook "transmission of anger" experiment is terrifying.
— Clay Johnson (@cjoh) June 28, 2014
How Has Facebook Reacted?
Facebook has defended the experiment, saying the research was done to improve services. Adam Kramer, the Facebook data scientist in charge of the experiment, wrote a Facebook post in response to the backlash. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused," he said. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."
Do you think Facebook's experiment is as problematic as many users believe?