If you're a Facebook user, you may have caught some bad feelings, brought to you by a mechanism that researchers call “emotional contagion.” We learned last weekend that Facebook users subjected to predominantly good news feeds tended to post good news themselves, while those subjected to bad news tended to post bad news. And news of the experiment that produced this unsurprising result has been decidedly bad news for Facebook. A British MP has called for an investigation into how social networks manipulate users' emotional and psychological responses, according to the Guardian.
But critics of the Facebook research may be overreacting. Warwick Business School Assistant Professor of Behavioral Science Suzy Moat said in an email, “In many ways, this experiment is simply a public example of the experiments that many businesses run on a regular basis to investigate how they can influence our behavior.”
She continued, “For example, Facebook and Amazon constantly experiment with showing different groups of people slightly different versions of their websites to see if one version is better at increasing the frequency with which users engage with the content, click on adverts, or buy products. You can opt out of these 'experiments' by not using the websites—but many people don't want to do this, as the services offer such value to them.”
“So it's interesting that there's such outrage about this, but not about the experiments which many online businesses run on all of us on a day-to-day basis—possibly because a lot of people simply don't realize that they're happening.”
The Facebook experiment, if conducted today, would appear to be in compliance with the user agreement Facebook's users must accept, which gives the company the right to use data for troubleshooting, analysis, testing, and research. But Forbes reports that the clause was added months after the research was completed.
Dr. Moat noted that the expectation is that “…scientific experiments are generally supposed to be run for public good, not for business interests, and it's obvious that many people currently feel that this experiment was not in their best interest.”
She sees pros and cons from the research, noting, “On one hand, the experiment has already done a lot of public good, by clearly demonstrating how small changes on a widely used service can affect the behavior of large numbers of people—something which many people may not have previously realized. Crucial to this realization is the fact that the methods and results have been made publicly available, in contrast to experiments run for business purposes.
“On the other hand, it's extremely understandable that many people are upset that their behavior may have been manipulated for purely scientific purposes without their consent. In particular, Facebook's user base is so wide that everyone wonders if they were in the experiment.”
Facebook COO Sheryl Sandberg has apologized, saying the research was poorly communicated and not meant to upset anyone. And Facebook data scientist Adam D. I. Kramer said he and his fellow scientists were trying to investigate whether seeing friends post positive content would leave others feeling left out, or whether exposure to negativity would lead people to avoid Facebook. He acknowledged that the researchers didn't clearly state their motivation in their paper.
Dr. Moat concluded, “This highlights a need for scientists to think back to how experiments in the social sciences have traditionally been run, and consider how open and transparent ethics procedures can be redesigned to take new technology such as Facebook into account. We also need to look at existing successful online experiments, such as Mappiness, which have recruited extremely large numbers of informed participants, simply by offering sufficient benefits for participating—just like a business does.”