
If you were still unsure how much contempt Facebook has for its users, this will make everything hideously clear.

In a report published in the Proceedings of the National Academy of Sciences (PNAS), Facebook data scientists describe an experiment that manipulated the emotions of nearly 700,000 users to see if positive and negative emotions are as contagious on social networks as they are in the real world. The researchers tweaked Facebook’s powerful News Feed algorithm so that some users (we should probably just call them “lab rats” at this point) were shown fewer posts containing positive words, while others saw fewer posts containing negative words. “When positive expressions were reduced,” the paper states, “people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
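Facebook hasn’t published the code behind this manipulation, but the design is simple enough to sketch: label each candidate post as positive or negative, then withhold some fraction of the posts carrying the targeted label from a user’s feed. Here is a minimal toy illustration in Python; every name and number in it is hypothetical, not Facebook’s actual News Feed machinery.

```python
import random

# Toy sketch of the manipulation described in the paper: withhold a fraction of
# posts labeled with the targeted emotion and serve the rest. All names and
# numbers here are hypothetical, not Facebook's actual News Feed code.

def filtered_feed(posts, withhold_label="positive", omission_rate=0.3, seed=42):
    """Return a feed with a share of posts carrying `withhold_label` removed.

    `posts` is a list of (text, label) pairs; `omission_rate` is the chance
    that any given matching post is dropped from this viewing session.
    """
    rng = random.Random(seed)
    feed = []
    for text, label in posts:
        if label == withhold_label and rng.random() < omission_rate:
            continue  # the post still exists; it just isn't shown right now
        feed.append(text)
    return feed

sample_posts = [
    ("Best day ever at the beach!", "positive"),
    ("Stuck in traffic again.", "negative"),
    ("So proud of my sister today!", "positive"),
    ("Meh, Mondays.", "neutral"),
]
print(filtered_feed(sample_posts, withhold_label="positive"))
```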

The results shouldn’t surprise anybody. What’s more surprising, and unsettling, is the power Facebook wields to shift its users’ emotional states, and its willingness to use that power on unknowing participants. First off, when is it okay to conduct a social-behavior experiment on people without telling them? Technically, as the paper states, users provided consent for this research when they agreed to Facebook’s Data Use Policy upon signing up, so what Facebook did isn’t illegal. But it’s certainly unethical.

Furthermore, manipulating user emotions in a digital space comes with uniquely disturbing consequences. In the real world, if you feel like the people around you bring too much negativity into your life, the solution is easy: find a new crowd. But on Facebook, short of canceling your account, there is no escape if the company suddenly decides, whether as part of a research study or at the behest of certain advertising or engagement interests, to start sending more negative content your way. The whole point of the News Feed algorithm, to hear Facebook tell it, is to give users an experience tailored to their wants and interests. Clearly, that objective falls by the wayside anytime Facebook wants to turn its user base into a science experiment.

And then there’s the tone-deaf gall of the whole thing: this research wasn’t uncovered by an investigative reporter; Facebook submitted it to PNAS itself. To make matters worse, there are questions about whether the methodology was even sound. To determine “positive” and “negative” sentiments, the researchers used a tool called “Linguistic Inquiry and Word Count,” or LIWC. But even the creators of LIWC admit that assessing its validity when applied to “natural language” (like a Facebook update) is “tricky.” LIWC’s reliability has largely been tested on essays, where there is more repetition than in natural language.
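For what it’s worth, LIWC is essentially a dictionary-based word counter: it tallies how many words in a text fall into predefined categories such as “positive emotion” and “negative emotion.” A stripped-down Python illustration of that idea follows; the tiny word lists are invented stand-ins, not LIWC’s actual (proprietary) lexicon.

```python
# LIWC-style scoring boils down to counting dictionary hits. These tiny word
# lists are invented stand-ins for illustration, not LIWC's proprietary lexicon.

POSITIVE_WORDS = {"happy", "love", "great", "proud", "best"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "stuck", "angry"}

def emotion_word_rates(text):
    """Return the fraction of words hitting the positive and negative lists."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return pos / len(words), neg / len(words)

print(emotion_word_rates("Best day ever, so happy!"))           # mostly positive hits
print(emotion_word_rates("Stuck in traffic. I hate Mondays."))  # mostly negative hits
```

A two-word status update gives a counter like this almost nothing to work with, which is exactly why applying the technique to short, slangy Facebook posts raises validity questions.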

Perhaps I’ve been watching too much Black Mirror, but my brain can’t help extrapolating some of the alarming potential uses of this power. Psychological warfare techniques, like gaslighting, have long been used by government agencies to create cracks in the psyches of political dissidents and other undesirables. Assuming the ties between government organizations and tech companies continue to strengthen (and we’ve already seen Facebook cave to government pressure before), what’s to stop the NSA from manipulating the content a person sees in their News Feed in a manner designed to drive them to insanity? It might not be that hard: if every time you opened Facebook, all you saw were ex-girlfriends, old friends who are more successful than you, and upsettingly extreme political rants from family members, that might be enough to drive a person mad.

It doesn’t have to be the government pulling the strings either — Facebook itself could target certain users, whether they be corporate rivals or current/former employees. Having such strong psychological control over your workforce would certainly have its benefits. And if Facebook ever gets caught? Why, the company could claim it’s all part of a social experiment, one that users tacitly agreed to when they signed up.

With over one-tenth of the world’s population signing into Facebook every day, and now with evidence to back the emotional power of the company’s algorithmic manipulation, the possibilities for widespread social engineering are staggering and unlike anything the world has seen. Granted, Facebook’s motive is probably just to convince people to buy more stuff in order to please advertisers, but the potential to use that power to sway elections or global trade could be enticing to all sorts of powerful interest groups.

Or maybe I’m just being paranoid. Hey Facebook, can you please crank up the happy meter on my News Feed so I can enjoy the rest of my weekend in peace?

[illustration by Brad Jonas for Pando]