
One of the biggest arguments raised by defenders of Facebook’s emotional manipulation experiment may no longer hold water.

Facebook’s defenders can no longer point to its Terms of Service agreement to absolve the company of ethical responsibility for the experiment, which attempted to influence users’ emotions through tweaks to the News Feed algorithm. Forbes reports that the company didn’t add the line about using customer data for “research” until four months after the study was conducted, which means Facebook had no informed, explicit, or even implicit consent for a psychological experiment affecting over 600,000 people.

A company spokesperson has defended the decision to conduct the study in statements to a variety of publications, including Forbes, despite the fact that Facebook had no form of consent from its subjects:

When someone signs up for Facebook, we’ve always asked permission to use their information to provide and enhance the services we offer. To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word ‘research’ or not.

But Facebook’s problems won’t end with the revelation about its Terms of Service agreement. The Wall Street Journal reports that the study included users under the age of 18, which will probably create more controversy for the company, especially since a number of its users lie about their age and could therefore be even younger than anyone thought. The Journal also says that several researchers, including one who edited the study, had concerns about its ethical ramifications because of the number of people it affected.

And that doesn’t even touch on concerns that this might be only the first of many studies meant to influence people through small tweaks to existing products, as Jaron Lanier explains in a New York Times op-ed:

Now that we know that a social network proprietor can engineer emotions for the multitudes to a slight degree, we need to consider that further research on amplifying that capacity might take place. Stealth emotional manipulation could be channeled to sell things (you suddenly find that you feel better after buying from a particular store, for instance), but it might also be used to exert influence in a multitude of other ways. Research has also shown that voting behavior can be influenced by undetectable social network maneuvering, for example.

It’s clear that this study has made people more concerned about the effect that companies like Facebook, which wield a measure of control over how we interact with the digital world, can have on our lives. Yet some people continue to argue that the site was simply tweaking an algorithm and optimizing its network to give users a better experience, not conducting an experiment for which it should have received prior consent, as I covered in a blog post on Monday:

[There are] people who also believe that Facebook should be allowed to “tweak its algorithms” — which is how they characterize the experiment — without thinking of the repercussions.

The problem is that conflating Facebook’s experiment with the changes tech companies make to countless other algorithms every day is a bit like saying that a doctor shoving poison pills down a person’s throat is okay because other doctors were writing legitimate prescriptions at the same time. It’s a willful misrepresentation of a disconcerting experiment that should be discussed instead of being accepted as part of the modern, algorithm-driven world in which we’ve decided to live.

As more information about this study comes out, the arguments used to defend it become harder to sustain. Facebook didn’t just tweak an algorithm in a way its users agreed to when they created their accounts; it conducted a psychological experiment on hundreds of thousands of people, some of whom might have been minors, without any form of consent. Now it’s trying to characterize the study as a simple attempt to improve its service, an argument that demonstrates the contempt it holds for every one of its 1.2 billion users.