I feel like we've been writing this since about 2007 or so, but Facebook really needs to work on its apologies.

In the company’s first official statement after its terrifyingly unethical study on the emotional effect the News Feed can have on its users, Facebook COO Sheryl Sandberg apologized for the way it communicated the research to the world. But she also defended the study as “ongoing research companies do to test different products” — which is pretty much the same sentiment that Facebook spokespeople and supporters have been expressing since the backlash began late last week.

“It was poorly communicated,” Sandberg said. “And for that communication we apologize. We never meant to upset you.”

Well, actually, in the case of a few hundred thousand users unwittingly included in the study, that's exactly what Facebook intended to do. As part of the study, 300,000 Facebook users had their News Feeds changed to include more negative items, with the specific goal of upsetting them.

It’s a classic example of a non-apology meant to placate consumers without Facebook accepting any responsibility for its actions.

The idea that Facebook was merely tweaking the News Feed to make it a better product is just ludicrous, as I wrote in a blog post countering that argument from the company’s supporters:

The problem is that conflating Facebook’s experiment with the changes tech companies make to countless other algorithms every day is a bit like saying that a doctor shoving poison pills down a person’s throat is okay because other doctors were writing legitimate prescriptions at the same time. It’s a willful misrepresentation of a disconcerting experiment that should be discussed instead of being accepted as part of the modern, algorithm-driven world in which we’ve decided to live.

This wasn’t a simple tweak to News Feed’s algorithms. It also wasn’t an example of researchers using the data that companies like Facebook collect to improve their products or better serve advertisements. If the company had just given the researchers access to anonymized data that didn’t result from a product change made specifically to have an emotional effect on its users, or if it had asked for consent before experimenting on them, things would be a little different.

Put another way: This wasn't analysis of existing data. I'm not suggesting that this study should make people any more wary about the vast amounts of information companies like Facebook have amassed over the years. (They should be worried about that, but not because of this study.) It was a study in which Facebook had a hypothesis (that the News Feed can affect emotions) and a way to test it (changing the News Feed and seeing what happens). That is an experiment.

Facebook can't pretend that this was a test to improve News Feed. It can't claim that its users had given their permission to be experimented upon when they signed up for its service. And it can't even claim that it attempted to protect those users, as it didn't even bother to exclude minors from the study. This was an experiment designed to affect the emotions of hundreds of thousands of people without their consent and with no regard for their wishes, ages, or histories.

That’s what Sandberg should be apologizing for. Not for the way Facebook has talked about the study. Not for the fact that people are upset about knowing that the company manipulated their emotions without telling them. (It still hasn’t told the affected users that their News Feed was changed as part of the experiment, by the way.) Not for gathering unfathomable amounts of data. For experimenting on people and allowing Facebook’s culture of contempt to reach the point where it can’t even admit to itself that this research was more than just a product tweak.

[illustration by Brad Jonas for Pando]