Facebook's 2012 emotional contagion experiment triggered so much public indignation that people forgot to ask the obvious question: If Facebook has a science team, is it doing something like that all the time, studying users as if they were lab rats?
To be fair, The Wall Street Journal attempted an answer. It talked to a former Data Science team member, Andrew Ledvina, who said, "Anyone on the team could run a test. They're always trying to alter people's behavior."
The interesting part, though, is what else the team has been doing -- apart from reducing the number of "positive" and "negative" posts in users' news feeds to see how that changed the tone of their posts, as the Data Science Group's Adam Kramer did in the now-infamous experiment.
In fact, Facebook has never tried to hide it was doing this kind of research. It just hasn't advertised it. The best place to start looking for the Data Science team's insights is, you guessed it, the unit's Facebook page.
It links to a lot of innocuous research, like this little study of how mothers on Facebook relate to their children or this one proving that true rumors are more viral than false ones. There is, however, plenty of more sensitive material, like this study of how close Facebook friendships really are. The scientist behind it, Moira Burke, surveyed about 4,000 people about their relationships with their friends and then matched the data to server logs of the participants' Facebook activity. Burke says in her description of the study that all those data were anonymized, but the approach still raises unpleasant possibilities.
And then there are behavior-changing experiments like the 2012 one. During the 2010 U.S. congressional election, 98 percent of American users aged 18 and over were shown a "social message" at the top of their news feeds, encouraging them to vote and then report having done so by hitting a special button. Users could see which of their friends had done so. One percent saw the same message, but without the pictures of their friends. The remaining one percent saw nothing at all. The scientists found that the social message worked best and, by matching their data to public voting records, showed that it generated tens of thousands of additional votes.
The experiment was widely reported in 2012, after the team that ran it published the results in Nature. For some reason, nobody got mad about it, though one could imagine repressive regimes interested in 100-percent voting turnouts using the technology to smoke out dissenters.
Much of the Facebook research is published. The easiest way to unearth it is to take the names of scientists from the Data Science team page and run a specialized Google Scholar search for them.
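For readers inclined to try this themselves, here is a minimal sketch in Python of what that search amounts to. It only constructs the query URLs; the "author:" operator and the scholar.google.com query format are assumptions based on Google Scholar's public search interface, and the two names are simply the researchers mentioned in this article.

```python
# A minimal sketch, not an official API: it builds Google Scholar query URLs
# for researchers named on the Data Science team's page. The "author:" search
# operator and URL format are assumptions based on Scholar's public interface.
import urllib.parse

def scholar_search_url(author_name: str) -> str:
    """Return a Google Scholar URL searching for a given author's papers."""
    query = f'author:"{author_name}" facebook'
    return "https://scholar.google.com/scholar?q=" + urllib.parse.quote_plus(query)

# Names taken from the Data Science team's Facebook page:
for name in ("Adam Kramer", "Moira Burke"):
    print(scholar_search_url(name))
```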
Kramer, for example, suggested the number of positive words in Facebook posts as a measure of "gross national happiness" and analyzed Facebook users' "self-censorship" -- that is, last-minute decisions not to publish a post after starting to write it (he found that people do it more for group posts, when their audience is harder to define). In 2011, he also argued that Facebook "holds potential to influence health behaviors of individuals and improve public health."
Facebook has no plans to stop this research activity. It has obvious commercial value because it tells the social network how to facilitate user interaction (like this study by Moira Burke, focusing on newcomers' socialization on Facebook). The academic community loves it, too: Facebook scientists' research is widely cited. Besides, it holds huge potential for governments seeking to gauge the public mood or influence it, as in the election experiment or in Kramer's public health example.
That may explain why Facebook Chief Operating Officer Sheryl Sandberg pointedly refused to apologize for the 2012 experiment, saying only that the company "regrets" the fact that it was "communicated really badly."
To contact the writer of this article: Leonid Bershidsky at firstname.lastname@example.org.