Facebook knows more than you think.

How shocking: Facebook had the temerity to conduct an experiment on its users without telling them, and now the results are published in the Proceedings of the U.S. National Academy of Sciences. I am not, however, shocked, and you shouldn't be, either.

For a week in 2012, Adam Kramer, the social network's staff scientist, and his two collaborators used algorithms to doctor the news feeds of 689,003 English-speaking Facebook users. They reduced the number of posts containing "positive" or "negative" words, then tracked their lab-rat users' own posts and found that their mood followed that of the news feed. The term, well known to psychologists studying real-world communication, is "emotional contagion."

If Kramer's week-long experiment appears outrageous, what Facebook does on a daily basis -- also without specifically asking its users -- is monstrous in the extreme. An algorithm called EdgeRank scores each post on a number of criteria, such as how frequently a News Feed owner interacts with its author and the quality of that interaction (a comment is more valuable than a "like"). The higher-ranked posts go to the top of the feed. That's why a typical user doesn't see everything her friends are posting -- just what Facebook decides she'd be interested in seeing, plus paid advertising (which is also supposed to be targeted). You can tweak the settings to make posts appear in their "natural" chronological order, but few people bother, just as hardly anyone ever reads Facebook's data use policy: buried among its roughly 9,000 words is a sentence saying that research is a legitimate use.
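To make the mechanics concrete, here is a minimal sketch of how a ranking of that shape might work -- affinity with the author, times the weight of the interactions a post has drawn, times a recency decay. The weights, the decay function and the affinity measure are illustrative assumptions for this sketch, not Facebook's actual EdgeRank formula.

```python
from dataclasses import dataclass

# Illustrative interaction weights -- assumptions for this sketch,
# not Facebook's actual EdgeRank values.
INTERACTION_WEIGHTS = {"comment": 4.0, "share": 6.0, "like": 1.0}

@dataclass
class Post:
    author: str
    age_hours: float
    interactions: list  # e.g. ["like", "comment", "like"]

def affinity(viewer_history: dict, author: str) -> float:
    """Stand-in for how often the viewer interacts with this author."""
    return 1.0 + viewer_history.get(author, 0)

def score(post: Post, viewer_history: dict) -> float:
    """Affinity x interaction weight x time decay -- the general shape of the ranking."""
    weight = sum(INTERACTION_WEIGHTS.get(kind, 0.5) for kind in post.interactions)
    decay = 1.0 / (1.0 + post.age_hours)  # newer posts count for more
    return affinity(viewer_history, post.author) * weight * decay

def rank_feed(posts, viewer_history):
    """Highest score first -- not the 'natural' chronological order."""
    return sorted(posts, key=lambda p: score(p, viewer_history), reverse=True)
```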

In other words, on Facebook you can opt out of having a machine decide what content you will find engaging. Twitter, by contrast, lets users opt in, through its so-called Discover feed. I find the opt-in approach more honest, but, predictably, it's less effective from a marketing point of view.

Facebook manipulates what its users see as a matter of policy. It makes no secret of the fact that it does, and the publication of Kramer's research is evidence of that. Academics may debate whether users gave informed consent, but common sense tells us that a lot of people choose to stay uninformed. Many will actually believe what they hear on television. Others know they are being tracked, and experimented upon, by the likes of Facebook and Google -- and do not mind. It's par for the course on the social web. "Run a web site, measure anything, make any changes based on measurements? Congratulations, you're running a psychology experiment!" tweeted venture capitalist Marc Andreessen, who sits on Facebook's board.

People who hate this have the option of not using Facebook, perhaps settling, as palliative therapy, for a network that is less invasive about leveraging its user base. It's like unplugging the TV or quitting smoking: so easy, and yet so hard.

The good news for those who want their social networking fix regardless is that Facebook only has a limited ability to influence our emotions. "At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it -- the result was that people produced an average of one fewer emotional word, per thousand words, over the following week," Kramer explained in a Facebook post on Sunday.

Kramer, as the person who designed the experiment, is actually overoptimistic about it. The tool used to check for positive and negative words, Linguistic Inquiry and Word Count (LIWC), is one of many primitive attempts to map natural language onto quantifiable criteria. Such programs see the word "happy" as positive and the word "sad" as negative, but they cannot detect sarcasm, which is essential to analyzing social network content, and they have a limited grasp of grammatical constructs that turn word meanings on their head. Psychologist John Grohol pointed out that LIWC would score the sentence "I am not having a great day" as neutral, because the positive "great" and the negative "not" cancel each other out.
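To see why that matters, here is a toy word-count scorer of the kind Grohol describes -- a minimal sketch with made-up word lists, not LIWC's actual dictionaries -- that reproduces the "neutral" verdict on his example sentence and cheerfully misreads sarcasm as good cheer.

```python
# Toy word-count sentiment scorer in the spirit of LIWC-style tools.
# The word lists are illustrative assumptions, not LIWC's actual dictionaries.
POSITIVE = {"happy", "great", "good", "love"}
NEGATIVE = {"sad", "not", "bad", "hate"}

def naive_sentiment(text: str) -> int:
    """Count positive words minus negative words -- no grammar, no sarcasm detection."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(naive_sentiment("I am not having a great day"))  # 0: "great" and "not" cancel out
print(naive_sentiment("oh great another monday"))      # +1: sarcasm read as positive
```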

As with a lot of Facebook data, which the enormous network has no ability to police, it's garbage in, garbage out.

If the U.S. Army Research Office actually took part in funding the research -- as Cornell University, whose Jamie Guillory was one of Kramer's co-authors, initially reported, before correcting itself to say there had been no external funding -- the military did not get access to a powerful mood-changing tool. People may be lazy about reading policy documents and resisting corporate manipulation, but they are more complicated than their Facebook accounts.

This column does not necessarily reflect the opinion of Bloomberg View's editorial board or Bloomberg LP, its owners and investors.

To contact the author on this story:
Leonid Bershidsky at lbershidsky@bloomberg.net