European agencies are beginning to question the privacy implications of Facebook’s psychological experiment, following an intense backlash over the study, which did not seek consent from its nearly 700,000 unwitting participants. The agencies told the New York Times that they have not yet opened an official investigation.
Pando’s David Holmes wrote about the unethical — and disturbing — aspects of the study after it was first published. In addition to describing the contempt Facebook has for its 1.2 billion users, Holmes considered the real-world effect these social experiments could have on people:
With over one-tenth of the world’s population signing into Facebook every day, and now with evidence to back the emotional power of the company’s algorithmic manipulation, the possibilities for widespread social engineering are staggering and unlike anything the world has seen. Granted, Facebook’s motives probably are simply to convince people to buy more stuff in order to please advertisers, but the potential uses of that power to impact elections or global trade could be enticing to all sorts of powerful interest groups.
The response to this study has only gotten worse. Researchers, including one who edited the study, have said it might have been unethical. The Wall Street Journal revealed that Facebook didn’t exclude users under the age of 18 from the experiment. The New York Times published an op-ed comparing the experiment to a drug company fiddling with its product. Our ignorance was Facebook’s bliss, but now we’re learning more about the experiment every day.
That includes the fact that Facebook didn’t receive even implicit consent from its users before conducting the experiment. It was previously assumed that the company’s Terms of Service allowed it to conduct “research,” but Facebook didn’t add that language to its agreement until months after the study took place. As I wrote after Forbes revealed this damning information earlier this week:
As more information about this study is revealed, the arguments used to defend it become less sustainable. Facebook didn’t just tweak an algorithm for a study that its users agreed to when they decided to create an account for the service; it conducted a psychological experiment on hundreds of thousands of people, some of whom might have been minors, without any form of consent. Now it’s trying to characterize the study as a simple attempt to improve its service — an argument that demonstrates the contempt it holds for every one of its 1.2 billion users.
The sheer amount of troubling information about this study hasn’t stopped some people from taking the indefensible position that Facebook should be allowed to experiment with its users’ emotions. After all, they say, all it did was tweak an algorithm, which it does every day. That argument isn’t just disingenuous and misleading; it’s downright terrifying:
Still, it’s hard to imagine why people outside the company would defend its actions. Maybe they also don’t care about other people. Maybe they’ve become so jaded that the idea of a technology company performing psychological experiments seems like a normal Saturday. Or maybe they want to give Facebook the chance to make everyone feel like they do, thus ending the criticism they face for suggesting that Eric Schmidt should be placed in charge of an entire goddamn country. Take your pick — no matter which motivation you choose, the truth is that we must now live in a world where Facebook’s right to unethical experiments is defended.
Now, at least, European agencies are starting to look into the study and its ramifications. It’s unclear how the questioning will end, since the real issue is the emotional effect Facebook can have on its users, not the privacy implications for people who unknowingly shared information with researchers. But it’s heartening to see that tech reporters aren’t the only ones looking into an unethical psychological experiment conducted on more than 600,000 people.
[Illustration by Hallie Bateman for Pando]