If you’ve paid any attention to the news lately, you know that people are not exactly in love with Facebook at the moment. Recent reports have revealed that the social media giant conducted a “massive psychological experiment” in 2012 on nearly 700,000 of its users.
In short, Facebook used an algorithm to manipulate the newsfeeds of hundreds of thousands of users so that some saw fewer “negative” updates while others saw fewer “positive” posts. The “experiment” ran for one week and recorded how often those users then posted updates of a similar emotional tone. It seems researchers wanted to test the “emotional contagion” effect and determine whether your friends’ posts affect your mood.
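The published study classified posts by counting emotion words, then withheld some posts of one category from a user’s feed. Here is a minimal Python sketch of that kind of word-list filtering; the word lists and function names are invented for illustration and have nothing to do with Facebook’s actual system:

```python
# Toy illustration of word-list sentiment filtering (NOT Facebook's
# real algorithm): classify each post by counting words from small
# made-up word lists, then drop posts of the suppressed category.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def classify(post: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by word counts."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts: list[str], suppress: str) -> list[str]:
    """Return the feed with posts of the suppressed sentiment removed."""
    return [p for p in posts if classify(p) != suppress]

feed = [
    "I love this sunny day!",
    "What an awful commute.",
    "Meeting moved to noon.",
]
print(filter_feed(feed, "negative"))
# → ['I love this sunny day!', 'Meeting moved to noon.']
```

Real sentiment classifiers are far more sophisticated, but the basic mechanic — score each post, then bias which ones a user sees — is this simple.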
The findings? Those who saw more negative content went on to post less positive content of their own, and vice versa. That doesn’t seem too surprising – humans are naturally social, empathetic beings.
Regardless of the experiment’s outcome, people are now furious, calling the study cruel “mind games” and “emotional manipulation.” Others have questioned the ethics of gathering this kind of data without users’ knowledge or consent. The Electronic Privacy Information Center (EPIC), a public interest research center based in Washington, D.C., has even gone so far as to file a complaint with the Federal Trade Commission, claiming Facebook deceived its users and violated a 2012 Consent Order.
But did Facebook do anything wrong? According to MarketWatch, the social network didn’t break any laws. In fact, by agreeing to Facebook’s terms and conditions, you automatically offer yourself as a willing participant in the site’s research and testing efforts. Always read the fine print, right? One of Facebook’s data scientists who participated in the study (the same scientist who let word of it slip to the media) recently admitted that every user has been part of an experiment at some point or another while on the site.
Meanwhile, Facebook executives are keeping apologies to a minimum. The social network’s chief operating officer Sheryl Sandberg has gone on record stating that while the study was “poorly communicated,” the experiment itself was not a mistake.
We’ve read plenty of analysis from social, psychological and legal experts, and it seems people are truly divided on the issue. There are even differing opinions among the KCD PR team. So we want to know: What do you think? Did Facebook overstep its bounds? Or are these big data studies now the justifiable norm? Will this news affect new membership or cause users to delete their accounts? How will the overly cautious social media users of the financial services industry respond? Head over to our Facebook, Twitter and LinkedIn pages to join the conversation! We’d love to hear what you have to say.
For those of you who are curious about what being subjected to this study felt like, a plugin for Google Chrome was recently created that allows you to manipulate the emotional tone of your own newsfeed. Test it out and let us know what you think!