Facebook's 2012 emotional contagion experiment triggered so much public indignation that people forgot to ask the obvious question: If Facebook has a science team, is it studying users as if they were lab rats all the time?
To be fair, The Wall Street Journal attempted an answer. It talked to a former Data Science team member, Andrew Ledvina, who said: "Anyone on the team could run a test. They're always trying to alter people's behavior."
The interesting question is how all these experiments are run -- apart from reducing the number of "positive" and "negative" posts in users' news feeds to see how that changed the tone of their own posts, as the Data Science Group's Adam Kramer did in the now-infamous experiment.
Facebook never tried to hide it was doing this kind of research. It just didn't advertise it. The best place to start looking for the Data Science team's insights is, you guessed it, the unit's Facebook page.
It links to a lot of innocuous research, such as this little study of how mothers on Facebook relate to their children or this one proving that true rumors are more viral than false ones. There is, however, plenty of more sensitive material, such as this study of how close Facebook friendships really are. The scientist behind it, Moira Burke, surveyed approximately 4,000 people about their relationships with their friends and then matched the responses to the server logs of the participants' Facebook activity. Burke says in her description of the study that all the data were anonymized, but the approach still raises unpleasant possibilities.
And then there are behavior-changing experiments, such as one that became public in 2012. During the 2010 U.S. congressional election, 98 percent of American users aged 18 and over were shown a "social message" at the top of their news feeds, encouraging them to vote and then report having done so by hitting a special button. Users could also see which of their friends had reported voting. One percent saw the same message, but without the pictures of their friends, and the remaining one percent saw nothing at all. By matching their data to public voting records, the scientists found that the social message worked best, generating tens of thousands of additional votes.
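Mechanically, that design amounts to a weighted three-arm test. Here is a minimal sketch of how such an assignment could work in principle -- the hashing scheme, weights and labels are my illustration, not anything Facebook has published about its infrastructure:

```python
# Hypothetical three-arm assignment mirroring the proportions the
# voting experiment reports: ~98% "social" message, ~1% plain
# "informational" message, ~1% control shown nothing. Hashing a
# stable user ID keeps each user in the same arm across sessions.
import hashlib

def assign_arm(user_id: str) -> str:
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    if bucket < 98:
        return "social"         # message plus pictures of friends who voted
    if bucket < 99:
        return "informational"  # same message, no friends' pictures
    return "control"            # no message at all

print(assign_arm("user-12345"))
```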
The experiment was widely reported in 2012, after the team that ran it published the results in Nature. For some reason, nobody got mad about it, though one could imagine repressive regimes interested in 100-percent voter turnouts using the technology to smoke out dissenters.
Much of the Facebook research is published. The easiest way to unearth it is to take the names of scientists from the Data Science team page and run a specialized Google Scholar search for them.
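As a concrete illustration, here is a minimal sketch that builds such searches from a hand-collected list of names. The two researchers listed are those mentioned in this article, and the "author:" operator is Google Scholar's standard way of restricting a query to one author:

```python
# Build Google Scholar search URLs for researchers collected by hand
# from the Data Science team page. Adding "facebook" to the query
# narrows the results to their Facebook-related work.
from urllib.parse import urlencode

BASE = "https://scholar.google.com/scholar"
researchers = ["Adam Kramer", "Moira Burke"]  # names mentioned above

for name in researchers:
    query = f'author:"{name}" facebook'
    print(name, "->", BASE + "?" + urlencode({"q": query}))
```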
Kramer, for example, suggested the number of positive words in Facebook posts as a measure of "gross national happiness" and analyzed Facebook users' "self-censorship" -- posts typed out but abandoned at the last minute before publishing (he found that people do it more for group posts). In 2011, he also argued that Facebook "holds potential to influence health behaviors of individuals and improve public health."
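The word-count approach behind that "gross national happiness" measure is simple enough to sketch. The toy version below scores a batch of posts as the share of positive words minus the share of negative words; Kramer's actual index relied on the proprietary LIWC word lists and standardized daily scores, so the tiny lists here are stand-ins:

```python
# Toy "gross national happiness" score: positive-word share minus
# negative-word share across a batch of posts. The word lists are
# illustrative stand-ins for the LIWC dictionaries Kramer used.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "angry", "awful", "hate", "worried"}

def happiness_index(posts):
    """Return (positive share) - (negative share) over all words."""
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

print(happiness_index(["What a wonderful day, I love it!",
                       "Feeling worried about the news."]))
```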
Facebook has no plans to stop this research activity. It has obvious commercial value, because it tells the social network how to facilitate user interaction (like this study by Moira Burke, focusing on newcomers' socialization on Facebook). The academic community loves it, too: Facebook scientists' research is widely quoted. Besides, it holds huge potential for governments seeking to gauge and influence the public mood, as in the election experiment or Kramer's public health example.
That may explain why Facebook Chief Operating Officer Sheryl Sandberg pointedly refused to apologize for the 2012 experiment, saying only that the company "regrets" it was "communicated really badly."
For Facebook users who don't want to be guinea pigs, the rule of thumb is to provide as little information about themselves as possible. That means getting an account under an assumed name and supplying false information on location, age, education, marital status and any other matters that the user considers private. This violates Facebook's terms of use, but there's nothing the social network can do about it. So many people are already doing this -- judging by my own news feed, at least -- that you have to question the value of all that research.