
Here’s a surprise: Sheryl Sandberg’s non-apology for the experiment in which Facebook tweaked nearly 700,000 people’s News Feeds to determine the effect it might have on their emotions hasn’t stopped the condemnation of the study from privacy groups and the editor-in-chief of the journal that published the findings. EPIC, the Electronic Privacy Information Center, has filed an official complaint with the Federal Trade Commission seeking an investigation into the study. Meanwhile, the editor-in-chief of PNAS has published a statement acknowledging that Facebook’s study might not have respected ethical standards.

EPIC argues in its complaint that Facebook “purposefully messed with people’s minds” in an experiment made possible by the “secretive and non-consensual use of personal information.” The group alleges that this violates the FTC’s 2012 Consent Order with Facebook and Section 5 of the FTC Act. It asks that the FTC impose sanctions on Facebook and require it to “make public the algorithm by which it generates the News Feed” as redress for the alleged violations.

The statement from PNAS editor-in-chief Inder Verma isn’t quite as drastic. It reads, in part:

Obtaining informed consent and allowing participants to opt out are best practices in most instances under the Department of Health and Human Services Policy for the Protection of Human Research Subjects (the “Common Rule”). Adherence to the Common Rule is PNAS policy, but as a private company Facebook was under no obligation to conform to the provisions of the Common Rule when it collected the data used by the authors, and the Common Rule does not preclude their use of the data. Based on the information provided by the authors, PNAS editors deemed it appropriate to publish the paper. It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out.

This statement echoes concerns from other researchers who have told the Guardian, the Wall Street Journal, and a number of other publications that Facebook’s decision to act without obtaining consent, and without excluding minors from its research, places the company in an ethically dubious position at best. Many people troubled by the study’s treatment of its human subjects are questioning its ethical standing even as others focus on Facebook’s right to experiment on its users, as I wrote in a blog post considering the mindset of the people defending the company:

It’s clear why anyone concerned about the ethics of manipulating the emotions of hundreds of thousands of people without explicit consent would view Facebook’s experiment as a mistake. So why do some people insist on defending Facebook’s supposed right to experiment on its users, despite being told that checking a box on a Terms of Service agreement isn’t the same as agreeing to participate in a psychological experiment?

Perhaps it’s because Valley culture has made it easy to think of millions or billions of people as little more than data points used to draw revenues from advertisers. Because this study technically required just a few tweaks to an algorithm, it’s easy to view it as a harmless experiment meant to gather more data, which has become increasingly sacred to the Valley. It’s hard to remember that a billion data points aren’t just bits and bytes; they’re individual people who might worry about Facebook’s ability to manipulate their emotions.

Facebook and its supporters can claim that the company conducted this experiment (along with hundreds of other research experiments performed with few guidelines, according to the Journal) with the intention of creating a better News Feed. It can apologize for people being upset about the study instead of for the study itself. It can hide behind the blanket of innovation and dodge questions that concern not its rights as a technology company but the rights its users have as living, breathing people rather than as a constellation of data points. But as EPIC, a number of investigators in Europe, and PNAS show, the backlash isn’t ending.