Facebook is the second-most visited website in the world, behind only Google, with more than 800 million distinct visitors every day. Social networking has become a ubiquitous part of our day-to-day lives, a combination of our own time capsule and a window into our friends' lives.
For better or worse, few websites have a stronger influence on our daily life than Facebook. In fact, one-third of 2011 divorce filings in the United States contained mentions of Facebook, according to Divorce Online.
With such a stranglehold on day-to-day life, Facebook has untold power to exert its influence in subtle ways. And last month, evidence of Facebook doing just that became news with the publication of "Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks" in the Proceedings of the National Academy of Sciences.
Without users' knowledge or consent, Facebook manipulated the news feeds of nearly 700,000 people, reducing either positive or negative content and then examining the emotional tone of those individuals' subsequent posts. The study found that emotional states can be transferred to others via "emotional contagion," essentially suggesting that emotions and moods are contagious, and more specifically that they can be influenced by Facebook. If more people on a subject's news feed posted negative content, the subject was more likely to post content displaying negative emotions, and the same held true for positive posts.
In the Tuskegee Syphilis Study, which ran from 1932 to 1972, African-American subjects with syphilis had their conditions tracked for decades, with the scientists never notifying them that penicillin had been found to be a viable cure. Telling them, the scientists feared, would lead to the subjects being cured, and the scientists would lose their study population.
In the aftermath, Congress passed the National Research Act of 1974, requiring informed consent from all participants in federally funded research; nearly all universities and pharmaceutical and biotech companies hold themselves to the same standard. Most of these organizations maintain their own institutional review boards, which review proposed studies for ethical concerns before granting approval.
As a private company operating in social media, Facebook is not legally bound by these standards. But should it be?
One of the downsides to informed consent, from the perspective of researchers, is the possibility that potential subjects will choose not to participate in the study. And given the Big Brother-esque concerns that many people already have about their data, such an experiment may well be viewed as an excessive intrusion into their private lives.
Today, social media can spark a revolution. In the wrong hands, could this knowledge push a country on the brink over the edge? Facebook already has a very strong ad-targeting system and tracks users across the internet. This study suggests that Facebook's knowledge and infrastructure could not only shape our emotions but incite something far larger.
Even when it's not being actively manipulated, social media can still play an active role in conflicts around the world. In fact, American officials recently approached a large social media company to request that it not remove graphic content posted by Islamic State fighters, because such posts provide valuable intelligence about their locations and strategies, among other things.
Facebook is playing with fire here, but can it control the blaze?